
Understanding errors, biases that can affect journalists

By S. MOLLY STOCKING and PAGET H. GROSS

Stocking is assistant professor of journalism at Indiana University. Gross is a law student at Columbia University.

In the last fifteen years, psychology has witnessed an explosion of knowledge about how human beings process information. Much of this new knowledge concerns limitations and biases in perception, memory, and reasoning.

Research on eyewitness testimony has highlighted the distortions in perception and memory that can plague observers, particularly observers under stress at the moment of observation or recollection. Similarly, studies on the way people make inferences have shown us that people often favor anecdotal information over more reliable base rate statistical information. Even when instructed to be "objective," they often seek and select data according to preexisting expectations or theories.

Other research has documented the difficulties people have evaluating risk. Still more research has shown that people easily ignore sampling biases, fail to understand regression effects, often believe they "knew something all along" even if they didn't, tend to attribute the causes of people's behavior to dispositional rather than to situational factors, and often imagine association between events where none exists. Significantly, many of the errors and biases to which people fall prey only get worse under time constraints.

For journalists who pledge allegiance to objectivity and/or fairness as they observe and interpret people and events, such knowledge is potentially of great relevance. Yet only a small percentage of what we know appears to have found its way into journalism classrooms, most of it in upper-level or graduate courses that consider how audiences, as distinct from journalists, process information.

Moreover, hardly any of this knowledge seems to have found its way into courses and textbooks that train students to write, edit, and report the news. Even those texts that do devote space to documented distortions in interviewing and observation (cf. Rivers & Harrington, 1988) have either ignored more recent findings on cognitive error and bias or drawn upon but a small portion of this research.

Given the growing demands on journalists to report and interpret the activities of an increasingly complex society and the demonstrated importance of the news media for setting public and private agendas, this is a shame. It is also a problem we may be able to do something about.

In the pages that follow, we will outline some of the more important errors and biases in thinking that psychologists have documented in recent years. We have little formal knowledge about the existence and operation of such biases in journalists (as we have noted, and lamented, in Stocking & Gross, 1988). However, we do know they have been found in a variety of professions and across a variety of tasks (cf. Loftus, 1979; Sims, Gioia, & Associates, 1986; and Rogoff & Lave, 1984), leading us to think they probably show up in journalists' work as well. We hope that journalism educators, when alerted to these common errors and biases, will be motivated to learn more about them and seek ways, in classrooms and textbooks, to bring them to the attention of their students.

The eyewitness fallacy

As everyone knows, "seeing is believing." When someone says, "I saw it with my own eyes," people listen, believe, and remember.

Indeed, psychologists who have studied the impact of eyewitness testimony in jury trials have found that jurors, when reaching a verdict, give much more weight to eyewitness testimony than they do to other kinds of evidence (Loftus, 1979).

Not surprisingly, journalists seem to understand intuitively the power of eyewitness accounts. Thus, as van Dijk (in press) has pointed out, editors so value first-hand reports that they "may even send a special envoy to places where already dozens of other reporters are present" (p. 78).

The problem is that eyewitness accounts, while more convincing than hearsay accounts, are not always reliable. Research on eyewitness testimony (cf. Loftus, 1979) is very clear about this fact: Observations can vary and err as a function of a variety of factors such as prejudice, temporary expectations, the types of details being observed, and stress. It is very easy, in other words, for one's observations (and one's memories about observations) to be distorted or flat-out wrong.

In one of the most well-known demonstrations of the fallibility of observations, reported in Loftus (1979), Hastorf and Cantril (1954) examined people's perceptions of a football game between Dartmouth and Princeton. The game, one of the dirtiest and roughest played by either team, was filmed and shown to students on each campus. Students were instructed to note any rule infractions they saw and to rate them as either "flagrant" or "mild."

In spite of instructions to be completely objective in their observations, students at the two schools perceived the infractions very differently. More precisely, students at Princeton saw the Dartmouth players make 9.8 infractions, more than twice the number of infractions they saw their own team make (4.2). Moreover, they saw the Dartmouth players make more than twice as many infractions as Dartmouth students saw their team make. Princeton students tended to rate the violations by their own team as mild and those committed by the Dartmouth team as flagrant. Dartmouth students were more evenhanded when they saw the film, seeing about the same number of infractions for both teams (4.3 by Dartmouth and 4.4 by Princeton), but they also tended to see the violations of the opposing team as more flagrant than violations by their own team. Obviously, people don't always see what others see; even when instructed to be objective, they often see what they want to see.

It is perhaps an obvious point that one's personal prejudices can unconsciously affect the accuracy of one's observations; personal prejudice is usually what journalism educators mean when they warn about "bias" in the news. However, research on eyewitness testimony (Loftus, 1979) suggests that other factors besides personal prejudice can lead to unreliable eyewitness reports, among them: temporary expectations, expectations from past experience, and stress.

Eyewitness accounts can also be unreliable when the event has been witnessed infrequently and for a short period of time, when violence has been involved, when a long time has elapsed between witnessing the event and reporting it, or when the witness is under stress at the time of recollection. Researchers have also found that information introduced after an event has taken place can alter the memory of it; thus, if a reporter witnesses a concert and then reads a competitor's article on the concert, the reporter may alter his own memory of the event to conform to information contained in his competitor's account.

The point in all this (and much more, contained in Loftus, 1979) is simple: People, journalists included, may put more weight on eyewitness accounts than they do on other kinds of evidence. Unless journalists are aware of the ways that eyewitness accounts can be biased and erroneous, they, like juries, may fallaciously assume that such accounts offer more truth than they do. Such assumptions could, in turn, influence reporters to "count on" such accounts and prematurely limit their reporting efforts.

Underutilization of statistics

Related to people's tendency to weight eyewitness accounts more than other types of evidence is a tendency for people to favor anecdotal or case history information over base rate statistical information (information about the percentage of cases in the population).

This tendency to underutilize more reliable base rate information, to focus on specific individuals and make less use of information about the population from which individuals come, has been vividly demonstrated in a study by Hamill, Wilson, and Nisbett (1980). In this study, subjects read a magazine article about a Puerto Rican woman who had a number of unmanageable children by a succession of common-law husbands. When this anecdotal case was presented along with base rate statistics indicating that 90 percent of welfare recipients "are off the welfare rolls by the end of four years," subjects regarded the case history as more informative than the more reliable base rate statistics. Put another way, the statistical facts had less impact on people's views about the laziness and hopelessness of welfare recipients than did the single vivid case.

Just why people favor anecdotal information over base rate information is unclear. It may be because the information contained in anecdotes is usually more vivid (Nisbett & Borgida, 1975; Nisbett & Ross, 1980), or it may be that the information in anecdotes seems more relevant (Tversky & Kahneman, 1978). But regardless of the precise reason, what the Hamill et al. study and related research suggests for journalists, who routinely use anecdotes to "personalize" the news, is a need to handle anecdotal information with considerable caution.
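
To make the base rate point concrete, consider a minimal sketch in Python (our illustration, not part of the research we cite; the sample size behind the statistic is an invented assumption). It shows how little a single case should shift an estimate that rests on a large sample:

    # Sketch: how much should one vivid case move a base-rate estimate?
    # Assumption (ours): the "90 percent leave within four years" figure
    # rests on 1,000 observed cases, so 100 of them were long-term.
    prior_long_term = 100
    prior_short_term = 900
    before = prior_long_term / (prior_long_term + prior_short_term)

    # A reporter now meets one vivid long-term case (the magazine profile).
    after = (prior_long_term + 1) / (prior_long_term + 1 + prior_short_term)

    print(f"Share of long-term recipients before: {before:.3f}")  # 0.100
    print(f"Share of long-term recipients after:  {after:.3f}")   # 0.101

On this accounting, the anecdote warrants a shift of about one-tenth of a percentage point; yet, in the study, the single case outweighed the statistics in subjects' judgments.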

Some sources are masters of the anecdote. Intentionally or unintentionally, they may present anecdotal data that do not square with more abstract statistical information. If reporters fall victim to the tendency to favor vivid anecdotal information over pallid but reliable statistics, they, and their audiences in turn, may be misled.

Confirmation bias

Intellectually, most of us probably realize that preconceived ideas shape how we view, interpret, and remember information. But what we may not realize is the extent to which such notions bias us.

In recent years, psychologists have conducted research revealing that preconceived ideas can be very powerful indeed in shaping what we see, understand, and remember. In fact, so powerful are their effects that it is hard to imagine they do not affect the work of journalists and journalists-in-training.

The tendency for people to seek, select, and recall data according to preexisting expectations or theories is called the "confirmation bias." To understand its pervasiveness, it is important to understand how it works.

Two processes seem important here. First, when people seek information with respect to one theory, they are unlikely to seek information with respect to another theory simultaneously. People, in short, test theories one at a time, or sequentially. So, for example, the person who is testing a theory about the negative impact of feminism on women's lives is unlikely to test theories about its positive impact as well. Similarly, the individual testing the theory that there is a "crime wave" against the elderly (Fishman, 1980) is unlikely to test the opposite theory that there are no crimes against the elderly.

Secondly, as people seek information with which to test their theories, they show a dramatic tendency to use a theory-confirming strategy. For example, a reporter who has theorized that there is a crime wave against the elderly may unconsciously seek out sources that confirm this theory — the potential and actual elderly victims in a bad part of town, the head of a crime prevention program for the elderly, etc. Further, the reporter may ask questions of these sources — about increases in reported crimes, efforts to reduce crimes, and the like — that confirm the theory, without asking probing questions that might disconfirm the hypothesis (Snyder & Swann, 1978).

Not only may expectations influence the sources to which reporters turn and the types of questions a reporter may ask, but they also may influence journalists' evaluation and selection of data. One pervasive bias in perceivers' decisions about what information is most relevant or credible is the tendency to regard information that is consistent with one's a priori theories as the worthiest pieces of information (Darley & Gross, 1983; Hayden & Mischel, 1976; Snyder & Gangestad, 1981). Thus, when one is testing a theory about the nature, causes, or outcome of an event, the information that will be selected as most useful is information that is consistent with and confirming of one's theory.

Information that runs counter to one's theories is discounted in a number of ways. For one thing, disconfirming evidence may be regarded as transient or situationally induced. For example, a political figure who is thought to be honest and forthright and who is then caught in lies about events may be regarded as momentarily confused, or without recall, or perhaps induced to perform dishonestly by misguided advisers or the pressure of office (Ross, Lepper & Hubbard, 1975). Even if journalists themselves do not interpret actions in this way, they may regard such interpretations by others as highly credible and so give them prominent play.

Secondly, disconfirming evidence may be regarded as arising from poor or shoddy sources. Thus, one may be particularly critical of the methodology of the disconfirming study, and, in fact, so critical that the study may be discarded as entirely unreliable. Reporters may similarly discard sources (persons or resources for data) that are disconfirming of their theories by virtue of their judged unreliability.

Fishman (1980), in his account of how a series of events in New York City came to be linked together as a "crime wave," noted that once a crime wave was established in journalists' minds it took on a life of its own, guiding reporters' perceptions of hitherto unconnected crimes, and city police officers' as well. "A week and a half after the coverage started, the police wire was steadily supplying the press with fresh incidents almost every day." Even when a reporter examined police crime statistics and discovered that crimes against the elderly had actually decreased (not increased) compared to the previous year, the crime wave theme remained in place. As Fishman tells it, "The reporter was puzzled and eventually decided to ignore the police figures. He felt they were unreliable and incomplete, and anyway he had to do the story as originally planned because the whole issue was too big to pass up or play down." (p. 5)

If people, including journalists, are unaware of the extent to which they seek to confirm their expectations and theories, it may be in part because the processes that allow this to happen operate below the level of consciousness. Reporters may honestly believe they are objectively considering all sides to an issue, while in practice they are processing information in a way that confirms what they expect or believe.

Misperceptions of risk

Every day, on the news, from their friends, in the movies, people hear about risk — the risk of diseases, natural disasters, technological mishaps, nuclear war, auto emissions, passive smoke. And, based on what they hear, people perceive some risks as greater than others. But are they right?

Psychologists have done a great deal of research on people's perceptions of risk in recent years, and their findings include the following:

• Anything that makes a hazard appear very memorable or imaginable, such as a recent disaster, vivid film, or media coverage, can influence one's perception of risk; thus, someone who has just seen the movie "Jaws" may overestimate the probability of being attacked by sharks on their vacation to the ocean, and someone who has just seen "The Towering Inferno" may overestimate the probability of being killed in a high rise fire.

• People overestimate the risk of death from dramatic or sensational causes, such as homicide, accidents, cancer, and natural disasters, and underestimate the risk of death from undramatic causes such as diabetes, emphysema, and asthma, causes which kill one at a time and are common in nonfatal form.

• When people lack strong opinions about a hazard, they are highly susceptible to the way risk information is presented; thus, the way information is framed (whether, for example, one says "only about one percent of the nation's five million chemical processing units handle hazardous waste materials that could result in runaway reactions," or "as many as 50,000 of the nation's five million chemical processing units handle hazardous waste materials...") can have a major effect on perceptions, even though the two figures describe the same quantity (see the arithmetic sketch after this list).

In short, researchers have found that people have a great deal of difficulty accurately perceiving risk (Slovic, 1986).
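
The equivalence of the two framings in the last example above is simple arithmetic, sketched below (our illustration, not part of the cited research):

    # Sketch: the two framings describe the same number of units.
    total_units = 5_000_000
    one_percent = 0.01 * total_units  # "only about one percent ..."
    print(int(one_percent))           # 50000, i.e., "as many as 50,000 ..."

Same fact, different frame; the perceptions it produces, however, can differ sharply.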

Not surprisingly, some of these same difficulties have revealed themselves in news media coverage of risk (cf. Combs & Slovic, 1979). Thus, it appears that journalists, like lay perceivers, process risk information poorly.

"Reporters obviously need to be educatedin the importance and subtleties of riskstories," psychologist Paul Slovic has writtenin an article deuiling some of the difficultiesinherent in informing and educating thepublic about risk (Slovic, 1986). In obviousagreement, a number of psychologists, masscommunication researchers, and other groupshave produced materials and events, includ-ing articles, p>amphlets, films, and work-shops, to help reporters undersund andclarify risk issues(c.f., Fischoff, 1985a, 1985b;Risk Reporting Projert, 1986; Institute forHealth Policy Analysis, 1984).

Since the news media are a dominant source of information about risk, it seems incumbent upon journalism educators to alert students to the ways in which they may err (and be influenced by the intentional and unintentional errors and biases of sources) as they process information about risk. It also seems incumbent upon us to alert them to some of the ways they can minimize such errors and biases.

Sample errors and biases

Although journalism educators have recognized for some time the need for journalists to pay attention to biases in formal poll data (cf. Wilhoit & Weaver, 1980), they have given relatively little attention to the need for journalists to attend to sampling biases in other realms, though they probably should. Researchers have found, for example, that people can and often do ignore biases in existing samples of information (Fiske & Taylor, 1984).

In one of the few textbook examples of this shortcoming in journalism, Tankard (1976) points to how journalists covered the Watergate hearings during the 1970s. One reporter, using letters reportedly sent to Senator Ervin's committee, concluded that televised hearings were appreciated by audiences, while another reporter, judging from call-ins to television stations, concluded just the opposite (pp. 51-52). Apparently, neither reporter stopped to consider the inherent biases in the samples they drew upon in reaching their conclusions.

Misunderstanding of regression

People have a poor understanding of regression, the fact that extreme events will, on the average, be less extreme when observed again. As a result, they often use extreme events to predict future extreme events (Jennings, Amabile, & Ross, 1982). In journalism, regression effects are sometimes appreciated, as when a literary critic raves about a first book but at the same time urges readers to wait for the next book because previous experience suggests that second novels often are not as good as first ones. However, other times they may not be appreciated, as when reporters herald the first very positive results of a study on a new drug. The study may show positive results, even great results, but it is only one study, and chance alone may have accounted for the findings.
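
A short simulation can make the regression effect visible (our sketch; the numbers are invented, and we assume for simplicity that the drug has no real effect at all):

    # Sketch: regression toward the mean in "first studies" of a new drug.
    # Assumption (ours): no true effect; each observed result is noise only.
    import random
    random.seed(1)

    n = 10_000
    first = [random.gauss(0, 1) for _ in range(n)]   # first published results
    second = [random.gauss(0, 1) for _ in range(n)]  # independent follow-ups

    # Select the studies whose first result looked most impressive (top 1%).
    cutoff = sorted(first)[-100]
    winners = [i for i in range(n) if first[i] >= cutoff]

    print(sum(first[i] for i in winners) / len(winners))   # well above zero
    print(sum(second[i] for i in winners) / len(winners))  # back near zero

Selecting studies because their first result was extreme guarantees, on average, a less extreme second result, even when nothing about the drug has changed.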

Hindsight bias

Psychologists have found that once people learn the outcome of an event, they tend to exaggerate their ability to have foreseen it. Thus, in one experiment, students were asked to predict the likelihood of various possible outcomes of President Nixon's forthcoming trips to Moscow and Peking. When, in the wake of Nixon's journey, the students were unexpectedly asked to recall their initial predictions, they remembered them as very close to what they now knew had happened (Fischhoff & Beyth, 1975). This tendency to overestimate what could have been foreseen is known as the "hindsight bias" or the "I-knew-it-all-along" phenomenon, and it seems to have been operating in much of the post hoc moralizing about journalistic coverage of the shuttle disaster (I knew or would have known, so others should have too); it may also have been operating when Sen. William Proxmire bestowed "Golden Fleece" awards on research that documented or explored the "obvious," though many journalists didn't recognize the fact.

Illusory correlation

Sometimes, psychologists have found, people greatly overestimate the frequency with which two characteristics or events are related, or they even impose a relationship where none exists (cf. Crocker, 1981; Jennings, Amabile, & Ross, 1982). This phenomenon, known as illusory correlation, has been demonstrated in a number of circumstances, but particularly when two things happen to be associated in meaning. Thus, if journalists commonly associate "longhairs" with demonstrations (that is, they expect to see people with long hair at demonstrations), they may overestimate the frequency with which long-haired people attend such events. Likewise, if reporters associate university-based artists with state grants for the arts, they may overestimate the frequency with which such artists (as distinct from community-based artists) have been awarded such grants.

Some of the research on illusory correlation (cf. Chapman & Chapman, 1982) appears to suggest that expectations can also lead people to impose illusory causal (as distinct from simple correlational) relationships between characteristics or events. Consider the journalist who notices that an abnormally high number of children living near a chemical spill have been born with birth defects. In the reporter's mind, chemical accidents often cause human health problems; if there is an increase in birth defects following a chemical accident, it would appear that the increase must be due to the accident. In the face of such strong prior expectations, it may never even occur to the reporter that a simple increase in the birth rate may have caused the unusually high number of defects.
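
A sketch of the reporter's arithmetic (with numbers we have invented for illustration) shows how a constant defect rate can still produce more cases:

    # Sketch: more births alone yield more defect cases at a constant rate.
    # All numbers are invented for illustration.
    defect_rate = 0.002        # assumed unchanged before and after the spill

    births_before = 5_000
    births_after = 7_000       # the local birth rate has simply risen

    print(births_before * defect_rate)  # 10.0 cases
    print(births_after * defect_rate)   # 14.0 cases

The count of defects rises by 40 percent, yet the rate, the figure that would actually implicate the spill, has not moved.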

Fundamental attribution error

In journalism texts and journalism classrooms, we do considerable talking about journalists' role as "interpreters" of behavior and events; we tell students not only to report behavior and events but also to explain the reasons behind behavior and events. But for many of us, that is as far as our training takes us. We provide no real understanding of what goes into making causal inferences and the biases and errors that can, and often do, occur. As a result, we send out students who, in their role as interpreters, are likely to commit a number of cognitive errors. Some of these errors are documented in attribution theory research, which concerns how people make causal inferences about their own and others' behavior.

Research on one such error, the ubiquitous "fundamental attribution error," suggests that one is more likely to attribute another person's behavior to his or her own dispositional qualities than to situational factors (Jones & Harris, 1967). Thus, reporters discovering a case of scientific fraud at their local university are more likely to explore the theory that the person was a "bad apple" than to explore the theory that the "barrel itself was rotten." There are documented exceptions to the fundamental attribution error, of course, but the tendency needs to be recognized.

Conclusion

There are numerous other documented biases and errors in people's thinking (see Fiske & Taylor, 1984). But our point, we hope, has been made: In most skills courses and most journalism texts, we put great stock in observable facts without adequately emphasizing some of the ways observations can vary and err. We warn our students to be wary of their own biases, but we appear to do very little to demonstrate how easily biases and prior expectations can affect their selection of data. We encourage students to "personalize" the news but fail to warn them adequately of the dangers, often unwittingly reinforcing the cognitive tendency to favor case histories or anecdotal information over important base rate information. In the same vein, we encourage coverage of activities that pose social or environmental risks, yet say little or nothing about the errors people commonly make when assessing risk. In these and many other small ways, we may be unknowingly failing our students and our profession.

The need — to learn more about such errors and biases, and to bring what we know and learn into the educational setting — is great. Admittedly, it may not be easy. Most of us have all we can do in a term to teach students the traditional fundamentals of interviewing, observation, and mining documents. To bring this knowledge into the classroom successfully will require us to develop exercises that will provide opportunities for learning about cognitive distortions without sacrificing the basics.

Teachers' manuals for undergraduate texts in social psychology often contain exercises for demonstrating such errors and biases; some of these might be modified for use in writing, reporting, and editing classes, but others will have to be developed from scratch. We must also work to integrate this and related knowledge into traditional journalism textbooks. Some materials produced by cognitive social psychologists are accessible enough to use as readings for undergraduate journalism majors (cf. Loftus, 1979; Stanovich, 1986), but most are not. It is imperative that those capable of translating psychological research be involved in producing and testing classroom-appropriate materials.

Whether journalists and journalists-in-training will be receptive to efforts to meet this need is another matter; another cognitive tendency that psychologists have documented is the "overconfidence phenomenon," or the tendency for people to overestimate the accuracy of their judgments (Fischhoff, Slovic & Lichtenstein, 1977). Also uncertain is whether journalists, even if receptive, will be able to inhibit these errors and biases, particularly under time constraints, which have been found to exaggerate such problems (Kruglanski & Freund, 1983).

Still, there is evidence that, at least under some circumstances, people can inhibit some of these biases. Simply telling people to be unbiased won't lessen their tendency to maintain a theory in the face of disconfirming evidence. However, some research suggests that telling people to consider carefully how they are evaluating evidence and to watch their biases as they go through the process of interpreting data does help (Lord, Lepper, & Thompson, 1980). Asking people to explain why the theory might be wrong has also worked in some experiments (Anderson, 1982).

By doing nothing, we can be almost certain that journalists, to the extent that they do make these mistakes, will be condemned to repeat them.

References

Anderson, C.A. (1982). Inoculation and counterexplanation: Debiasing techniques in the perseverance of social theories. Social Cognition, 1, 126-139.

Chapman, L.J., & Chapman, J.P. (1982). Test results are what you think they are. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases. N.Y.: Cambridge University Press, 239-248.

Combs, B. & Slovic, P. (1979). Newspaper coverage of causes of death. Journalism Quarterly, 56, 837-843.

Crocker, J. (1981). Judgment of covariation by social perceivers. Psychological Bulletin, 90, 272-292.

Darley, J.M. & Gross, P.H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20-33.

Fischhoff, B. (1985a). Environmental reporting: What to ask the experts. The Journalist, Winter, 11-15.

Fischhoff, B. (1985b). Cognitive and institutional barriers to "informed consent." In M. Gibson (Ed.), Risk, Consent, and Air. Totowa, N.J.: Rowman & Allenheld.

Fischhoff, B. & Beyth, R. (1975). "I knew it would happen": Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13, 1-16.

Fischhoff, B., Slovic, P. & Lichtenstein, S. (1977). Knowing with certainty: The appropriateness of extreme confidence. Journal of Experimental Psychology: Human Perception and Performance, 3, 552-564.

Fishman, M. (1980). Manufacturing the News. Austin, Tex.: University of Texas Press.

Fiske, S.T. & Taylor, S.E. (1984). Social Cognition. Reading, Mass.: Addison-Wesley Publishing Company.

Hamill, R., Wilson, T.D., & Nisbett, R.E. (1980). Insensitivity to sample bias: Generalizing from atypical cases. Journal of Personality and Social Psychology, 39, 578-589.

Hastorf, A. & Cantril, H. (1954). They saw a game: A case study. Journal of Abnormal and Social Psychology, 49, 129-134.

Hayden, T. & Mischel, W. (1976). Maintaining trait consistency in the resolution of behavioral inconsistency: The wolf in sheep's clothing. Journal of Personality and Social Psychology, 44, 109-132.

Institute for Health Policy Analysis. (1985). Health Risk Reporting: Roundtable Workshop on the Media and Reporting of Risks to Health. Washington, D.C.: Georgetown University Medical Center Institute for Health Policy Analysis.

Jennings, D.L., Amabile, T.M. & Ross, L. (1982). Informal covariation assessment: Data-based vs. theory-based judgments. In D. Kahneman, P. Slovic, and A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases. N.Y.: Cambridge University Press.

Jones, E.E. & Harris, V.A. (1967). The attribution of attitudes. Journal of Experimental Social Psychology, 3, 1-24.

Kruglanski, A.W. & Freund, T. (1983). The freezing and unfreezing of lay inferences: Effects on impressional primacy, ethnic stereotyping and numerical anchoring. Journal of Experimental Social Psychology, 19, 448-468.

Loftus, E.F. (1979). Eyewitness Testimony. Cambridge, Mass.: Harvard University Press.

Lord, C.G., Lepper, M.R. & Thompson, W.C. (1980, September). Inhibiting biased assimilation in the consideration of new evidence on social policy issues. Paper presented at the meeting of the American Psychological Association, Montreal, Canada. (Cited in Fiske and Taylor, 1984.)

Myers, D.G. (1983). Social Psychology. New York: McGraw-Hill Book Company.

Nisbett, R.E. & Borgida, E. (1975). Attribution and the psychology of prediction. Journal of Personality and Social Psychology, 32, 932-943.

Nisbett, R.E. & Ross, L. (1980). Human Inference: Strategies and Shortcomings of Social Judgment. Englewood Cliffs, N.J.: Prentice-Hall.

Risk Reporting Project. (1986). Piscataway, N.J.: University of Medicine and Dentistry of New Jersey-Rutgers Medical School.

Rivers, W.L. & Harrington, S.L. (1988). Finding Facts: Research Writing Across the Curriculum (Second Edition). Englewood Cliffs, N.J.

Rogoff, B. & Lave, J. (1984). Everyday Cognition: Its Development in Social Context. Cambridge, Mass.: Harvard University Press.

Ross, L., Lepper, M.R. & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attribution processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.

Sims, H., Gioia, D.A. & Associates. (1986). The Thinking Organization. San Francisco: Jossey-Bass.

Slovic, P. (1986). Informing and educating the public about risk. Risk Analysis, 6, 403-415.

Snyder, M. & Gangestad, S. (1981). Hypothesis-testing processes. In J.H. Harvey, W. Ickes, & R.F. Kidd (Eds.), New Directions in Attribution Research: Vol. 3. Hillsdale, N.J.: Erlbaum.

Snyder, M. & Swann, W.B., Jr. (1978). Hypothesis-testing processes in social interaction. Journal of Personality and Social Psychology, 36, 1202-1212.

Stanovich, K.E. (1986). How to Think Straight About Psychology. Glenview, Ill.: Scott, Foresman.

Stocking, S.H. & Gross, P.H. (1988). How Journalists Think: A Proposal for the Study of Communicatory Behavior. Paper presented to the Theory and Methodology Division, Association for Education in Journalism and Mass Communication annual meeting, Portland, OR.

Tankard, J.W. (1976). Reporting and the scientific method. In M. McCombs, D.L. Shaw, & D. Grey (Eds.), Handbook of Reporting Methods. Boston: Houghton Mifflin Company.

Tversky, A. & Kahneman, D. (1978). Causal schemata in judgments under uncertainty. In M. Fishbein (Ed.), Progress in Social Psychology. Hillsdale, N.J.: Erlbaum.

van Dijk, T.A. (in press). News as Discourse. New York: Erlbaum.

Wilhoit, G.C. & Weaver, D.H. (1980). Newsroom Guide to Polls & Surveys. Washington, D.C.: American Newspaper Publishers Association.