Pseudo-Opinions on Public Affairs



Bishop, George F., Robert W. Oldendick, Alfred J. Tuchfarber, and Stephen E. Bennett. "Pseudo-Opinions on Public Affairs." The Public Opinion Quarterly, Vol. 44, No. 2 (Summer, 1980), pp. 198-209. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. Stable URL: http://www.jstor.org/stable/2748428



    GEORGE F. BISHOP, ROBERT W. OLDENDICK, ALFRED J. TUCHFARBER AND STEPHEN E. BENNETT

Abstract: This article reports on the often suspected but rarely researched tendency of survey respondents to give opinions on topics to which they have given little or no thought. The findings, based on a question about a fictitious public affairs issue, do show that the magnitude of the problem is substantial. But the data also demonstrate that this phenomenon does not represent simple random error, reflecting instead basic social-psychological dispositions which can be elicited, unwittingly, in the context of the interview.

George F. Bishop is a Senior Research Associate, Robert W. Oldendick is a Research Associate, and Alfred J. Tuchfarber is Director, Behavioral Sciences Laboratory at the University of Cincinnati. Stephen E. Bennett is Associate Professor of Political Science at the University of Cincinnati. The research reported here was supported by a grant from the National Science Foundation (SOC78-07407). The authors want to thank Howard Schuman and Stanley Presser for their comments on a presentation of a previous version of this paper at the annual conference of the American Association for Public Opinion Research, Buck Hill Falls, Pennsylvania, June 1979.

Public Opinion Quarterly © 1980 by The Trustees of Columbia University. Published by Elsevier North Holland, Inc.

THE BELIEF that respondents often give opinions on issues, objects, or events they know nothing about has become one of the more ingrained superstitions in survey research. We say "superstition" because, aside from some scattered and not always well-documented accounts, such as the survey cited by Payne (1951), in which over two-thirds of a sample supposedly offered an opinion on a fictitious "Metallic Metals Act," we know very little about this potentially major source of nonsampling error.1 There is even a hint in a recently published investigation by Schuman and Presser (1978) that researchers may have exaggerated the extent of "nonattitudes" in the American public (cf. Converse, 1970). For they found that a sizable majority (63 percent) of their respondents were quite willing to volunteer "don't know" when asked their opinion of the not-so-well-known military government in Portugal. But impressive as this figure might seem, it was based on a real subject that had been covered in varying degrees by the mass media.

1 Hartley's (1946) early studies of college students showed large majorities willing to offer opinions on such fictitious nationalities as the Wallonians. Kolson and Green (1970) discovered similar levels of opinionation in a sample of grade school children for a nonexistent political figure named Thomas Walker. Neither study, however, provides much theoretical guidance as to why this phenomenon occurs (cf. Bogart, 1967).

A more direct approach would be simply to ask people whether they have an opinion about an issue or topic that theoretically does not exist. This, in fact, is what we did as part of a larger set of experiments on the effects of opinion filtering (Bishop et al., 1979). In this report we will look first at how extensive the nonattitude or "pseudo-opinion" problem may be in U.S. surveys. We will then try to identify the principal social and psychological sources of this disposition, including a partial assessment of the extent to which it reflects deliberate falsification. And finally, we will explore the probable consequences of excluding respondents who express such tendencies from analysis of other items within the same survey.

Research Design

The data for our analysis come from two separate field experiments. The first one was part of a broader RDD telephone survey of citizen alienation from local government carried out by the Behavioral Sciences Laboratory in July and August of 1978 (N = 631).2 The second was part of an even larger omnibus vehicle known as the Greater Cincinnati Survey, the interviewing for which took place in November and December of 1978 (N = 1,218).3 In each of these experiments respondents were randomly assigned to one of four conditions, and within one of these, to one of two subconditions:

Subgroup A. Respondents in this group were asked about their opinions on a number of policy issues (see Table 2)4 using a version of a filter question that had originally appeared in the SRC 1956-1962 American National Election Studies: "Do you have an opinion on this or not?"

    Subgroup B. Here the filter accompanying each issue was the one first introduced by SRC in the 1964 election study: "Have you been interested enough in this to favor one side over the other?"

Subgroup C1. Respondents received a variation of the filter currently used by the Michigan Center for Political Studies: "Have you thought much about this issue?"

Subgroup C2. Respondents were administered a slightly different version of the filter used in Subgroup C1: "Where do you stand on this issue, or haven't you thought much about it?"

Subgroup D. This group was not exposed to any filter question when asked about each of the policy issues.

2 Approximately 63 percent of the respondents in this study were members of a panel being interviewed for a second time, the rest being from a previously uninterviewed control group. The reinterview rate for the (14-month) panel was 62.7 percent. The response rate for the control group was 67.7 percent. There were no significant differences between these subsamples, however, in the proportions giving an opinion on our fictitious topic. Nor were there any on the major independent variables we analyzed: education, race, and trust.

3 The response rate for this survey was 73.6 percent. For further documentation, see the technical report for The Greater Cincinnati Survey: November-December, 1978. The population used for this second survey consisted of residents (18 and over) of telephone households in the county, which includes the city, whereas the first survey was limited to telephone households in the city only.

4 The exact wording of these other items can be obtained by writing to the authors.

Embedded within the series of domestic and foreign policy questions that we asked of each group was the following fictitious issue: "Some people say that the 1975 Public Affairs Act should be repealed."

If the respondent was in the nonfilter condition (D) he or she was immediately asked: "Do you agree or disagree with this idea?" Otherwise, in the various filter conditions (A, B, C1, C2), respondents were initially screened for having an opinion. And if they did, they were asked: "Do you agree or disagree with the idea that the 1975 Public Affairs Act should be repealed?"5
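To make the branching concrete, here is a minimal sketch of the logic just described. It is an illustration only, not the authors' instrument; the function name, the ask() convention, and the response labels are our own assumptions.

    # A sketch of the branching described above, not the authors' instrument.
    # ask() stands in for posing a question and recording the answer; it is assumed
    # to return True/False for a screening question and one of "agree", "disagree",
    # or "voluntary DK" for the substantive question. All names here are hypothetical.
    FILTERS = {
        "A": "Do you have an opinion on this or not?",
        "B": "Have you been interested enough in this to favor one side over the other?",
        "C1": "Have you thought much about this issue?",
        "C2": "Where do you stand on this issue, or haven't you thought much about it?",
    }

    def administer_paa_item(condition, ask):
        """Return "agree", "disagree", "filtered DK", or "voluntary DK"."""
        if condition == "D":  # no filter: the substantive question is asked directly
            return ask("Some people say that the 1975 Public Affairs Act should be "
                       "repealed. Do you agree or disagree with this idea?")
        if not ask(FILTERS[condition]):  # explicit filter screens the respondent first
            return "filtered DK"         # no opinion / not interested / not thought about it
        return ask("Do you agree or disagree with the idea that the "
                   "1975 Public Affairs Act should be repealed?")

    # Example: a respondent in condition "A" who claims an opinion and then agrees.
    canned_answers = iter([True, "agree"])
    print(administer_paa_item("A", lambda question: next(canned_answers)))  # -> agree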

In many of the tables which follow we have pooled the data from the two experiments, largely because of the need for adequate cell sizes. Generally speaking, the pattern of differences among experimental conditions in the two studies was highly similar. In some cases data were available from only one of the two experiments, however, and we note this accordingly. The reader should also know that the filter question accompanying the Public Affairs Act (PAA) item was identical in conditions C1 and C2. For this reason we collapse these subgroups in the tables below.

5 An anonymous reviewer suggested that, since this item involves a possible acquiescence response set, it may maximize opinion giving. The results in Table 6 below, however, indicate that a more basic social-psychological disposition is implicated.

Findings

A substantial number of respondents claimed to have an opinion on the Public Affairs Act (Table 1). The estimate from our pooled sample of the Cincinnati area was about a third of the adult population, that is, in the absence of an explicit filter.6 Even in the face of such a hurdle, we found that roughly 5 to 10 percent of the respondents would persist in volunteering an opinion. Both of these figures are very similar to those found in a national sample by Schuman and Presser (1980) for a real but highly obscure federal statute called the "Agricultural Trade Act of 1978." Nor is the percentage with opinions on the PAA in our nonfilter condition (33 percent) that different from the percentage giving opinions in the same condition to their previously cited question about the Portuguese military government (37 percent).7 Hence, the distinction between what is a real topic and what is not may be irrelevant for much of the mass public, for whom political content is of generally low salience.

Table 1. Response to a Question about Repealing the 1975 Public Affairs Act by Filter-Nonfilter Condition (Pooled File)

    Condition                                            Agree   Disagree   Filtered DK   Voluntary DK   Total    (N)
    A: "Do you have an opinion on this or not?"           3.9%      3.2         82.5           10.4       100%   (463)
    B: "Have you been interested enough in this to
       favor one side over the other?"                    3.8%      3.6         76.6           16.0       100%   (445)
    C1/C2: "Have you thought much about this issue?"      2.9%      1.6         85.8            9.7       100%   (445)
    D: (no filter)                                        15.6%     17.6         0.0           66.8       100%   (467)

    NOTE: "Filtered" DK includes respondents who said they did not have an opinion, were not interested, or had not thought much about an issue when exposed to an explicit filter. "Voluntary" DKs indicate a nonsubstantive response given after the choices were read, i.e., the usual "don't know" response.

6 In Experiment I the figure was 41 percent in the nonfilter condition; in Experiment II, 32 percent. The difference is due largely to the greater proportion of blacks in the former survey who are generally more likely to offer opinions on this item (see Table 4 below). The reader may also be curious to know how these percentages compare to those found for real issues like the ones shown in Table 2 below. In the nonfilter condition we found a range from 85 percent with an opinion on the arms to Turkey item to approximately 97 percent on the affirmative action issue, whereas in the filter condition it ran from about 48 percent to 91 percent on the same two issues, respectively (see Bishop et al., 1979). On most other issues we have generally found about 85 to 95 percent of the population willing to volunteer an opinion in the absence of a filter, but which, when introduced, typically takes out about 20 to 25 percent of respondents, a figure that is quite similar to that reported by Schuman and Presser (1978).

7 Schuman and Presser (1980) have pointed out, however, that these superficial similarities in the marginals of real and fictitious issues can conceal important differences in the relationship of opinions to other variables like education.

Another striking feature of the marginals in Table 1 is the division of opinion among those who volunteered one. The agree/disagree split looks suspiciously close to 50:50 in most conditions; in fact, it is almost exactly that for the sample as a whole, suggesting perhaps that a random model of responding would provide an appropriate fit (cf. Converse, 1964; 1970). But, as we shall see shortly, this is not at all the case, at least not in terms of the way we normally conceptualize item-specific responses.

Furthermore, we have evidence which indicates that responding to a fictitious item like the Public Affairs Act is not simply a matter of deceit. In our second experiment we tested this proposition by also asking respondents whether they had heard of a nonexistent social service agency called ECTA.8 About 10 percent said they had, but there was no significant association in either the filter or nonfilter conditions between responses to this item and giving an opinion on the PAA. This does not, of course, rule out deception in response to specific items, but it does suggest that it is not a general characteristic.

Of greater significance to many researchers is the question of whether respondents who offer opinions on the Public Affairs Act will do the same on topics that are real but not especially salient in their daily lives. The figures in Table 2 tell us that such people were indeed more likely to express an opinion on all other issues we investigated.9 This was particularly true in the filter condition and for the more abstract matters of policy, such as resumption of arms shipments to Turkey and the SALT negotiations. The most plausible reason for the larger differences in the filter condition is that it acts as a "test" which more sharply differentiates those respondents who tend to volunteer opinions on topics they are unfamiliar with from those who are more willing to acknowledge their ignorance. And apparently the more remote the topic becomes from day-to-day concerns, the greater is the effect of this predisposition.

8 ECTA is an acronym for the computer program, Everyman's Contingency Table Analysis, developed by Leo Goodman and his associates at the University of Chicago.

9 We collapsed across filter conditions because of the need, again, for adequate subgroup sizes.

Table 2. Percentage with Opinions on Other Policy Issues, in Filter and Nonfilter Conditions, by Opinion/No Opinion on the Public Affairs Act

                                                      Filter                          Nonfilter
    Policy Issue                            Opinion PAA   No Opinion PAA     Opinion PAA   No Opinion PAA
    Government vs. private solution
      of problems                           88.4 (86)     68.0 (1,263)       98.1 (155)    93.6 (311)
    Affirmative action for blacks           94.2 (86)     91.3 (1,266)       97.4 (155)    96.5 (310)
    Tax cut if loss of jobs for
      public employees                      80.2 (86)     71.7 (1,264)       97.4 (155)    89.1 (309)
    Government vs. private health
      insurance                             90.7 (86)     80.9 (1,265)       98.1 (155)    90.9 (311)
    Reestablishment of diplomatic
      relations with Cuba                   71.8 (85)     57.2 (1,266)       92.2 (154)    84.0 (312)
    Resumption of arms shipments
      to Turkey                             68.2 (85)     47.2 (1,264)       94.2 (155)    80.4 (312)
    Experiment II
    SALT negotiations and Soviet
      interference in African affairs       91.7 (48)     68.1 (831)         93.3 (89)     88.4 (216)

    NOTE: Entries are percentages of respondents giving an opinion on an issue (subgroup sizes shown in parentheses).

SOCIAL AND PSYCHOLOGICAL SOURCES

Probably our strongest expectation was that respondents with lower levels of education would be most likely to say they had an opinion on the Public Affairs Act. Even though we knew from a number of previous studies that such people were less likely to have opinions on matters of public policy (Faulkenberry and Mason, 1978; Francis and Busch, 1975; J. Converse, 1976-77; Schuman and Presser, 1978), we reasoned that they would also be less likely to realize that our question on the PAA was a dummy. In addition, there was evidence from the classic study by Lenski and Leggett (1960) showing lower-status respondents to be more deferential or acquiescent toward middle-class interviewers. Thus we thought they would be more inclined to "go along" by offering an opinion on "another one of those questions," particularly in the absence of an explicit filter. The results for the nonfilter condition in Table 3 confirm the hypothesis without question, but in the filtered subgroup the relationship is not only much weaker, it tends in the opposite direction, with college-level respondents most likely to offer an opinion. One way to interpret this interaction is in terms of a "floor effect." That is, in the filter condition, there are so few respondents who offer an opinion on the PAA (4 to 8 percent) that there is little opportunity for education (or any other variable) to produce substantial variation, whereas in the nonfilter condition this floor is removed.

Table 3. Percentages with Opinion on Public Affairs Act, in Filter and Nonfilter Conditions, by Education

    Education     Filter (a)     Nonfilter (b)
    0-11 yrs.     6.6 (333)      40.2 (122)
    12            3.9 (431)      35.6 (149)
    13+           8.2 (562)      26.2 (189)

    NOTE: χ2 for response by education by filter/nonfilter interaction = 12.06, df = 2, p < .01.
    (a) χ2 = 7.34, df = 2, p < .05 (Gamma = -.14).
    (b) χ2 = 6.93, df = 2, p < .05 (Gamma = .21).
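For readers who want to see how statistics like those in the notes to Table 3 are obtained, the sketch below computes a Pearson chi-square and a Goodman-Kruskal gamma for a cross-tabulation of opinion-giving by education. This is not the authors' code; the cell counts are our approximate reconstructions from the percentages and Ns reported for the filter condition, and with those counts the output comes out close to the published values (χ2 = 7.34, Gamma of about -.14).

    # Illustrative only, not the authors' code: Pearson chi-square and
    # Goodman-Kruskal gamma for a table of counts. The counts below are
    # approximate reconstructions from Table 3 (filter condition):
    # roughly 6.6% of 333, 3.9% of 431, and 8.2% of 562.

    def chi_square(table):
        """Pearson chi-square statistic for a list-of-lists table of counts."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        n = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                expected = row_totals[i] * col_totals[j] / n
                stat += (observed - expected) ** 2 / expected
        return stat

    def gamma(table):
        """Goodman-Kruskal gamma for a table with ordered rows and columns."""
        concordant = discordant = 0
        rows, cols = len(table), len(table[0])
        for i in range(rows):
            for j in range(cols):
                for k in range(i + 1, rows):
                    for l in range(j + 1, cols):
                        concordant += table[i][j] * table[k][l]  # same order on both variables
                    for l in range(j):
                        discordant += table[i][j] * table[k][l]  # opposite order
        return (concordant - discordant) / (concordant + discordant)

    # Rows: opinion offered vs. not; columns: 0-11 yrs, 12 yrs, 13+ yrs.
    filter_table = [[22, 17, 46],
                    [311, 414, 516]]
    print(round(chi_square(filter_table), 2), round(gamma(filter_table), 2))  # ~7.34, ~-0.14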

    Notice also in Table 3 that using a filter tends to be considerably more effective in reducing the magnitude of volunteered opinions in the less educated subgroups. This may help explain the somewhat higher level of opinion-giving among college-educated respondents; that is, they may be simply less willing to admit that they do not have an opinion on some issue of "public affairs," perhaps because of their need to appear well-informed (see, e.g., Ferber, 1956). Two quite different processes may then be operating to produce the same result: in one case because respondents don't know any better (i.e., the less educated); in the other, because they don't want anyone to know that they "don't know." Or perhaps they are both variations of a more generic process of "saving face."

A form of face-saving may likewise explain the association between race and opinions on the Public Affairs Act (Table 4). In both the filter and nonfilter conditions blacks volunteered an opinion far more frequently than whites. Our speculation is that it represents an attempt by many blacks not to fit stereotypes about being "uninformed" to our presumptively white middle-class interviewing staff . . . "calling from the University of Cincinnati."10 If that is true, there are some disturbing implications for the ever-growing use of telephone interviewing, with which it is technically impossible to achieve any adequate matching of race of interviewer with race of respondent. The only practical alternative would be to increase, substantially, the number of black interviewers on our staffs, randomize properly the assignment of telephone numbers, and control for this extraneous source of variation statistically.

Table 4. Percentage with Opinion on Public Affairs Act, in Filter and Nonfilter Conditions, by Race

    Race      Filter (a)      Nonfilter (b)
    Black     11.4 (290)      48.5 (97)
    White      4.9 (1,028)    28.6 (357)

    NOTE: χ2 for response by race by filter/nonfilter interaction = .04, df = 1, n.s.
    (a) χ2 = 15.19, df = 1, p < .001 (Q = .43).
    (b) χ2 = 12.79, df = 1, p < .001 (Q = .40).

10 While a respondent's color may not be "visible" over the phone, we have discovered an extremely high correlation (Q = .99) between the perceived race of the respondent in the first wave of the panel study we identified in footnote 2 and reported race in the second wave. We suspect that much of this is based on stereotypic, but accurate, inferences by our interviewers from vocal cues; if they were so aware of race, so too must our respondents have been.

We also examined the influence of several other sociodemographic variables (age, sex, and income), but none of them was of any real consequence (data not shown here). There was a slight, statistically significant tendency for males in the filter condition to report more frequently having an opinion on the PAA, but this difference was not substantively important. Education and race thus appear to be the principal social sources of this phenomenon.

The only other variable that we expected to mediate responses to our fictitious issue was interest or involvement in politics, which, while correlated with education, will generally have an independent effect on the expression of opinions on public affairs (see, e.g., Francis and Busch, 1975). Our data (not shown here) supported this expectation, showing a pattern that was very similar to the one we found for education (Table 3).11 In the filter condition it was respondents with higher levels of interest in politics who were more likely to offer an opinion on the PAA, whereas the reverse was true in the nonfilter condition. When we controlled education, however, the relationship between interest in politics and opinion on the PAA was significant only in the filter condition (data not shown here).

Our next finding was totally serendipitous. Along with a number of questions concerning political alienation in the first experiment we had included the three-item index used by the Michigan Survey Research Center to measure interpersonal trust (see Robinson and Shaver, 1973), or what Rosenberg (1957) had originally developed as the Faith-in-People Scale. When we correlated responses to this index with those on the public affairs question, a highly significant interaction emerged: in the filter condition there was virtually no association at all, but in the nonfilter condition mistrustful respondents were more than twice as likely to say they had an opinion on the PAA than those with more faith in other people (Table 5).12

11 To measure political interest we used the standard Michigan CPS question: "Now ... some people seem to follow what's going on in government and public affairs most of the time, whether there's an election going on or not. Others aren't that interested. Would you say that you follow what's going on in government and public affairs most of the time, some of the time, only now and then, or hardly at all?" Respondents who said they followed these matters "most of the time" were defined as the high interest group; the rest were collapsed into a low interest category.

    12 We created the index by first summing responses to the three dichotomous items, yielding a range from 0 to 3. We then dichotomized this distribution into the high (2, 3) and low (0,1) trust subgroups shown in Table 5.
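The construction described in footnote 12 amounts to a two-step recode, sketched below. The assumption that each item is scored 1 for the trusting response and 0 otherwise is ours, not stated in the article.

    # Sketch of the trust index described in footnote 12. The coding convention
    # (each SRC item scored 1 for the trusting response, 0 otherwise) is an
    # assumption on our part.
    def trust_group(item1, item2, item3):
        """Sum three dichotomous items (range 0-3) and dichotomize."""
        score = item1 + item2 + item3           # 0, 1, 2, or 3
        return "high" if score >= 2 else "low"  # high trust = 2 or 3; low trust = 0 or 1

    print(trust_group(1, 1, 0))  # two trusting answers out of three -> "high"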


Table 5. Percentage with Opinion on Public Affairs Act, in Filter and Nonfilter Conditions, by Interpersonal Trust

    Trust     Filter (a)     Nonfilter (b)
    High      7.0 (230)      26.3 (76)
    Low       9.4 (212)      57.3 (75)

    NOTE: χ2 for response by trust by filter/nonfilter interaction = 4.03, df = 1, p < .05.
    (a) χ2 = .60, df = 1, n.s.
    (b) χ2 = 13.69, df = 1, p < .001 (Q = .58).

Since we knew from previous investigations that blacks and less educated respondents were generally less trusting of others (Robinson and Shaver, 1973)13 and since they were also more likely to express opinions on the PAA, we suspected that this finding might be spurious. However, controlling these variables, both separately and simultaneously (data not shown here), had virtually no effect on the original association. Race, education, and trust all appear, then, to produce relatively independent effects on the expression of opinions on a fictitious issue.

The interesting question, of course, is why people with little faith in others more often give opinions than those with greater faith. The answer perhaps lies in the documented connection between trust and self-esteem (see, e.g., Robinson and Shaver, 1973). Our question about a credible-sounding Public Affairs Act can be viewed as a threat to the sense of personal competence of a misanthropic individual, and so he or she volunteers an opinion as a form of self-protection against being thought "stupid" or "uninformed." This interpretation fits in with our earlier discussion of "saving face" as an explanation for racial and educational differences. Each of these variables may be tapping somewhat different aspects of an underlying factor of self-confidence. It does, after all, take a certain amount of confidence in oneself to acknowledge that one does not have an opinion on something that sounds important.

13 We found similar associations in our dataset between race and the trust index (Q = -.67, p < .001), and between education and trust (Q = .54, p < .001).

Opinions on the PAA, once expressed, may not be just a matter of randomly flipping mental coins (cf. Converse, 1964). Indeed, the data in Table 6 tell us that many respondents also fall back on their general disposition to trust or mistrust other people or institutions, in this case the government or public sector, when deciding to agree or disagree with the proposition concerning our ambiguous PAA stimulus. Thus we find those low on trust more inclined to agree with the idea of repealing the act and those high on trust more likely to disagree, though the relationship reaches statistical significance in the filter condition only.

Table 6. Substantive Response to Repeal of Public Affairs Act, by Interpersonal Trust and Filter/Nonfilter Conditions (Experiment I)

                  Filter Condition               Nonfilter Condition
    Response      High Trust    Low Trust        High Trust    Low Trust
    Agree            25.0%        70.0%             35.0%        41.9%
    Disagree         75.0         30.0              65.0         58.1
    Total           100.0%       100.0%            100.0%       100.0%
    (N=)            (16)         (20)              (20)         (43)

    Filter: χ2 = 5.51, df = 1, p < .05 (Q = -.75). Nonfilter: χ2 = .06, df = 1, n.s.
    NOTE: χ2 for response by trust by filter/nonfilter interaction = 3.19, df = 1, p = .073.

    SUBSTANTIVE CONSEQUENCES

Because we already know that respondents who volunteered opinions on the Public Affairs Act tended to be black, less educated, less interested in politics, and less trusting of others, it should not surprise us to learn that they also differed in the kinds of substantive responses they gave to other policy items (Table 7). In general, they were more in favor of the "liberal" or progovernment position on domestic issues, an association which is considerably stronger (as we might expect) in the nonfilter condition. In contrast, there was little or no relationship between giving an opinion on the PAA item and responses to the three foreign affairs questions. Moreover, when we controlled education and race (data not shown here), many of the original relations shown in Table 7 remained about the same in magnitude, though there were several noticeable specifications. In other words, volunteering an opinion on the PAA appears to be independently related to expressing a progovernment position on domestic issues.

Table 7. Relationships Between Opinion/No Opinion on the Public Affairs Act and Substantive Response to Other Policy Issues, by Filter-Nonfilter Condition

    Policy Issue                                      Filter           Nonfilter
    Government vs. private solution of problems       -.29*   (935)    -.52***  (443)
    Affirmative action for blacks                      .27  (1,237)     .33*    (450)
    Tax cut if loss of jobs for public employees      -.24    (975)    -.47***  (428)
    Government vs. private health insurance            .26* (1,102)     .27**   (433)
    Reestablishment of diplomatic relations
      with Cuba                                        .01    (785)     .10     (404)
    Resumption of arms shipments to Turkey             .01    (655)    -.06     (397)
    Experiment II
    SALT negotiations and Soviet interference
      in African affairs                              -.09    (610)    -.11     (274)

    NOTE: Entries are Yule's Q-coefficients. *p
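Table 7's note identifies its entries as Yule's Q (the same Q statistic appears in the notes to Tables 4 through 6). For a 2 x 2 table with cell counts a, b, c, and d, Q = (ad - bc)/(ad + bc). A minimal sketch, with made-up counts rather than anything from the article:

    # Yule's Q for a 2 x 2 cross-tabulation with cells
    #            col 1   col 2
    #   row 1      a       b
    #   row 2      c       d
    def yules_q(a, b, c, d):
        """Return (ad - bc) / (ad + bc), which ranges from -1 to +1."""
        return (a * d - b * c) / (a * d + b * c)

    # Hypothetical counts, not taken from the article:
    print(round(yules_q(30, 20, 15, 35), 2))  # 0.56, a moderately strong positive association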

These findings led us to speculate whether respondents who answered the PAA item might not have derived a good deal of its meaning from the immediately preceding context of the interview schedule, which consisted of the four domestic issues with which it was correlated. Inferring that it had to do with "the government," albeit vaguely, they would thus rely on their general disposition to trust or mistrust this institution in deciding how to answer. So far from being a random or thoughtless reaction, it tells us something about how readily respondents will engender meaning where public opinion analysts think there is none, or where such researchers think there is one, and they substitute another.

At the same time, we should not lose sight of the unintended consequences of our general failure in public opinion surveys to remove such respondents with a filter question or a dummy item of the kind we have used here. If they are indeed more likely to give a "liberal" response to domestic policy issues, based on some vague sense of trust or mistrust toward the government sector, then we risk distorting, perhaps considerably in the case of more abstract matters, the level of public support for various social and economic programs. To express it still another way, for a significant segment of the population (a third?), we may be measuring not much more than their general positive or negative affect toward the government, rather than specific beliefs about the policy alternatives contained in our questions.

And yet, how are we to say that such responses, once elicited, are any less "real" than any others? Insofar as they represent more enduring (personality) dispositions, one might even argue that they are more genuine or "substantial" than the opinions elicited by the specific content of public policy items. Ultimately, it comes down to the old question of validity: What is it that we are trying to measure?



References

Bishop, George F., Robert W. Oldendick, Alfred J. Tuchfarber, and Stephen E. Bennett
1979 "Opinion filtering and political attitude structure." Paper presented at the annual conference of the Midwest Political Science Association, Chicago, Illinois, April 1979.

Bogart, Leo
1967 "No opinion, don't know, and maybe no answer." Public Opinion Quarterly 31:331-45.

Converse, Jean M.
1976-77 "Predicting no opinion in the polls." Public Opinion Quarterly 40:515-30.

Converse, Philip E.
1970 "Attitudes and non-attitudes: continuation of a dialogue." Pp. 168-89 in E. R. Tufte (ed.), The Quantitative Analysis of Social Problems. Reading, Mass.: Addison-Wesley.

Faulkenberry, G. David, and Robert Mason
1978 "Characteristics of nonopinion and no opinion response groups." Public Opinion Quarterly 42:533-43.

Ferber, Robert
1956 "The effect of respondent ignorance on survey results." Journal of the American Statistical Association 51:576-86.

Francis, Joe D., and Lawrence Busch
1975 "What we now know about 'I don't know.'" Public Opinion Quarterly 39:207-18.

Hartley, Eugene L.
1946 Problems in Prejudice. New York: Octagon Books.

Kolson, Kenneth L., and Justin J. Green
1970 "Response set bias and political socialization research." Social Science Quarterly 51:527-38.

Lenski, Gerhard E., and John C. Leggett
1960 "Caste, class, and deference in the research interview." American Journal of Sociology 65:463-67.

Payne, Stanley L.
1951 The Art of Asking Questions. Princeton: Princeton University Press.

Robinson, John P., and Phillip R. Shaver
1973 Measures of Social Psychological Attitudes (rev. ed.). Ann Arbor: Institute for Social Research, University of Michigan.

Rosenberg, Morris
1957 Occupations and Values. Glencoe, Illinois: The Free Press.

Schuman, Howard, and Stanley Presser
1978 "The assessment of 'no opinion' in attitude surveys." Pp. 241-75 in K. R. Schuessler (ed.), Sociological Methodology 1979. San Francisco: Jossey-Bass.
1980 "Public opinion and public ignorance: the fine line between attitudes and non-attitudes." American Journal of Sociology (in press).
