Political Science Scope and Methods
Devin Caughey and Rich Nielsen
MIT | 17.850 | Fall 2019 | Th 9:00–11:00 | E53-438
http://stellar.mit.edu/S/course/17/fa19/17.850
Contact Information
Devin Caughey: Email [email protected]; Office E53-463; Office Hours Th 11–12 or by appt. (sign up on my door)
Rich Nielsen: Email [email protected]; Office E53-455; Office Hours Th 3–4 or by appt. (sign up on my door)
Course Description
This course provides a graduate-level introduction to and overview of approaches to political science research. It does not delve deeply into normative political theory or statistics and data analysis (both of which are covered in depth by other courses), but it addresses almost all other aspects of the research process. Moreover, aside from being broadly positivist in orientation, the course is otherwise ecumenical with respect to method and subfield. The course covers philosophy of science, the generation of theories and research questions, conceptualization, measurement, causation, research designs (quantitative and qualitative), mixing methods, and professional ethics. The capstone of the course is an original research proposal, a preliminary version of which students (if eligible) may submit as an NSF research proposal (at least 8 students have won since 2014). In addition to this assignment, the main requirement of this course is that students thoroughly read the texts assigned each week and come to seminar prepared to discuss them with their peers.
Expectations
• Please treat each other with respect, listen attentively when others are speaking, and avoid personal attacks. Everyone should feel comfortable expressing their opinions, political or otherwise, as long as they do so in an appropriate manner.
• Laptops, phones, and other electronic devices should be turned off and put away during class unless we ask you to take them out. This requirement may seem old-fashioned, but we believe it is the best way to foster discussion and mutual engagement. If you need electronics, please come talk to us outside of class.
• We do not tolerate plagiarism. Never take credit for words or ideas that are not your own, and always give your readers enough information to evaluate the source and quality of your evidence. Self-plagiarism (reusing material you have written in another context) is also prohibited without prior permission from the instructors. For more information on academic integrity, consult http://integrity.mit.edu/.
Goals
• Learn the methodological canon of political science. How do we think we know what we think we know? Where has the field been and where is it going? How do you evaluate others’ research? How do you design successful research yourself?
• Produce a professional-grade research proposal.
• Write well. Your research will be better and more influential if you can express yourself well. We expect clear, active prose. Avoid jargon and passive voice where possible. These resources might help:
Verlyn Klinkenborg. 2013. Several Short Sentences About Writing. New York: Vintage
Helen Sword. 2012. Stylish Academic Writing. Cambridge, MA: Harvard University Press
John Van Maanen. 2011. Tales of the Field: On Writing Ethnography. Chicago: University of Chicago Press
William Germano. 2014. From Dissertation to Book. Chicago: University of Chicago Press
Joan Bolker. 1998. Writing Your Dissertation in Fifteen Minutes a Day: A Guide to Starting, Revising, and Finishing Your Doctoral Thesis. New York: Holt
Steven Pinker. 2014. The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century. New York: Penguin
• Win grants and fellowships to support your research.
Grades
You should worry more about learning the material than about your grade, but we are required to give grades. Graduate school grades tend to be higher than what you may be used to. An A+/A means “excellent/good” and a B+/B/B− means “adequate/fair/poor.” If we give you a C or lower, we need to talk.
Assignments (see schedule for due dates):
• 5% One- to three-paragraph description of proposed research.
• 0% Introduction and theory section of research proposal. This assignment is ungraded, unless you don’t turn it in, in which case it’s worth −10%. Our goal is to give you feedback that helps with the grant proposal and final assignment.
• 25% Grant proposal. Write this proposal to the length required by the funding agency, ideally around 2–4 pages (800 to 1600 words). Select a grant or fellowship such as the NSF GRFP competition and write a proposal in the required style. If you select a fellowship other than the NSF GRFP, please clear it with us ahead of time and submit information about the fellowship’s required style with your assignment. Students who have submitted proposals in the past have been very competitive.
• 0% Draft of grant proposal (optional). If you turn in a draft we’ll give you feedback.
• 50% Research design assignment. Approximately 15 to 22 pages (5000 to 7000 words). Ideally, this should read like the first two thirds of a research article, up to (but not including) the point where you would present the findings.
• 15% Class participation and mastery of the material. Read the required material and be prepared to discuss it. For those of you less comfortable with seminar participation, now is the time to practice. Aim to speak up at least three times per session. To avoid dominating the discussion, let others speak if you’ve spoken recently.
• 5% Complete COUHES training.
Books for Purchase
John Gerring. 2012. Social Science Methodology: A Unified Framework. 2nd ed. New York: Cambridge University Press
Zoltan Dienes. 2008. Understanding Psychology as a Science: An Introduction to Scientific and Statistical Inference. New York: Palgrave Macmillan
Gary King, Robert Keohane, and Sidney Verba. 1994. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton, NJ: Princeton University Press
Semester Overview
1 September 5: Becoming a Political Scientist
2 September 12: Philosophy of Science
→ Complete MIT COUHES training before the start of class
3 September 19: Theory
→ One- to three-paragraph description of proposed research topic due
4 September 26: Appraisal
5 October 3: Conceptualization and Description
6 October 10: Measurement
→ Introduction and theory section of research proposal due
7 October 17: Causation and Causal Arguments
→ Draft of grant proposal due (optional)
8 October 24: Quantitative Analysis I—Experiments
→ NSF proposal due (official application deadline: 5pm)
9 October 31: Quantitative Analysis II—Confounders, Colliders, Mechanisms, and Instruments
10 November 7: Qualitative Analysis I—The Methods Wars
11 November 14: Qualitative Analysis II—Case Studies and Process Tracing
12 November 21: Qualitative Analysis III—Fieldwork and Interpretation
November 28: NO CLASS (Thanksgiving)
13 December 5: Mixing Methods (Or Not)
December 12: NO CLASS
→ Research design assignment due
Course Schedule
1 September 5: Becoming a Political Scientist
What does it mean to study politics scientifically (as well as ethically)? What are some examples of good work and successful careers in political science? What skills and traits should a political scientist develop? What does a grant proposal look like?
Required Readings (144 pages)
Textbook
PDF John Gerring. 2012. Social Science Methodology: A Unified Framework. 2nd ed. New York: Cambridge University Press, xix–xxiii (Preface) and 1–23 (ch. 1: “A Unified Framework”)
Exemplars
PDF Kenneth J. Arrow, Robert O. Keohane, and Simon A. Levin. 2012. “Elinor Ostrom: An Uncommon Woman for the Commons.” Proceedings of the National Academy of Sciences 109 (33): 13135–13136
PDF David R. Mayhew. 2015. “Robert A. Dahl: Questions, Concepts, Proving It.” Journal of Political Power 8 (2): 175–187
Ethics
Gary King and Melissa Sands. 2015. “How Human Subjects Research Rules Mislead You and Your University, and What to Do About It.” August 15. https://gking.harvard.edu/files/gking/files/irb_politics_paper_1.pdf
Jesse Singal. 2015. “The Case of the Amazing Gay-Marriage Data: How a Graduate Student Reluctantly Uncovered a Huge Scientific Fraud.” New York Magazine, May 29. http://nymag.com/scienceofus/2015/05/how-a-grad-student-uncovered-a-huge-fraud.html
Workflow and Reproducibility
PDF Allan Dafoe. 2014. “Science Deserves Better: The Imperative to Share Complete Replication Files.” PS: Political Science & Politics 47 (1): 60–66
PDF Tim Buthe and Alan M. Jacobs, eds. 2015. “Symposium: Transparency in Qualitative and Multi-Method Research.” Qualitative & Multi-Method Research 13 (1). https://www.maxwell.syr.edu/moynihan/cqrm/Qualitative_Methods_Newsletters/Table_of_Contents_13_1/, 2–8 (Introduction), 52–64 (Conclusion)
Kieran Healy. 2018. “The Plain Person’s Guide to Plain Text Social Science.” April 28.http://plain-text.co (SKIM)
Examples of NSF GRFP proposals
PDF Blair Read (successful proposal)
PDF Mina Pollmann (successful proposal)
PDF Anonymous 1 (unsuccessful proposal)
PDF Anonymous 2 (unsuccessful proposal)
2 September 12: Philosophy of Science
→ Complete MIT COUHES training before the start of class
What are the philosophical underpinnings of science? How does science work in practice? What, if any, are the differences between the natural and social sciences?
Required Readings (154 pages)
Overview of Philosophy of Science
PDF Dienes, Understanding Psychology, 1–32 (ch. 1: “Karl Popper and Demarcation”) and 33–54 (ch. 2: “Kuhn and Lakatos: Paradigms and Programmes”)
Classic Perspectives on Philosophy of Science
PDF Karl Popper. 1976. “Some Fundamental Problems in the Logic of Scientific Discovery.” In Can Theories Be Refuted? Essays on the Duhem-Quine Thesis, edited by Sandra G. Harding, 89–112. Dordrecht, Holland: D. Reidel. Originally published in 1935
PDF Imre Lakatos. 1976. “Falsification and the Methodology of Scientific Research Programmes.” In Can Theories Be Refuted? Essays on the Duhem-Quine Thesis, edited by Sandra G. Harding, 205–259. Dordrecht, Holland: D. Reidel. Originally published in 1970
PDF Paul K. Feyerabend. 1993. Against Method. 3rd ed. New York: Verso. Originally published in 1975, 5–8 (“Analytical Index”)
PDF David A. Freedman. 2010. “On Types of Scientific Inquiry: The Role of Qualitative Reasoning.” Chap. 20 in Statistical Models and Causal Inference: A Dialogue with the Social Sciences, edited by David Collier, Jasjeet S. Sekhon, and Philip B. Stark, 337–356. New York: Cambridge University Press. Originally published in 2008
Philosophy of Social Science
PDF Gerring, Social Science Methodology, 27–36 (part of ch. 2: “Beginnings”)
Additional Resources
Karl Popper. 2002. The Logic of Scientific Discovery. New York: Routledge
Thomas Kuhn. 2012. The Structure of Scientific Revolutions. Chicago: University ofChicago Press
Sandra G. Harding, ed. 1976. Can Theories Be Refuted? Essays on the Duhem-QuineThesis. Dordrecht, Holland: D. Reidel
Paul K. Feyerabend. 1976. “The Rationality of Science (From ‘Against Method’).” InCan Theories Be Refuted? Essays on the Duhem-Quine Thesis, edited by Sandra G.Harding, 289–315. Dordrecht, Holland: D. Reidel
Gabriel A. Almond and Stephen J. Genco. 1977. “Clouds, Clocks, and the Study of Politics.” World Politics 29 (4): 489–522
Gerring, Social Science Methodology, 394–401 (Postscript: “Justifications”)
Bruce Bueno de Mesquita. 2004. “The Methodical Study of Politics.” In Problems and Methods in the Study of Politics, edited by Ian Shapiro, Rogers M. Smith, and Tarek E. Masoud, 227–247. New York: Cambridge
3 September 19: Theory
→ One- to three-paragraph description of proposed research topic due
Gerring divides the scientific process into two stages: discovery and appraisal. This week focuses on discovery—the generation of new questions, models, theories, hypotheses, and arguments. How do you come up with research questions? Is formalizing theory worth it? What do we gain from deductive and inductive arguments? Which is better, a simple theory or a complex one?
Required Readings (157+ pages)
Overview
Gerring, Social Science Methodology, 37–57 (second part of ch. 2: “Beginnings”) and 58–73 (ch. 3: “Arguments”)
Perspectives
PDF Barbara Geddes. 2003. Paradigms and Sand Castles: Theory Building and Research Design in Comparative Politics. Ann Arbor: University of Michigan Press, 27–40, 87–88
PDF Paul Krugman. 1994. “The Fall and Rise of Development Economics.” In Rethinking the Development Experience: Essays Provoked by the Work of Albert O. Hirschman, edited by Donald A. Schon and Lloyd Rodwin, 39–58. Washington, DC and Cambridge, MA: Brookings Institution / Lincoln Institute of Land Policy
PDF Kevin A. Clarke and David M. Primo. 2012. A Model Discipline: Political Science and the Logic of Representations. New York: Oxford University Press, 52–60, 78–103
PDF Fred Eidlin. 2011. “The Method of Problems versus the Method of Topics.” PS: Political Science & Politics 44 (4): 758–761
PDF Kieran Healy. 2017. “Fuck Nuance.” Sociological Theory 35 (2): 118–127
Personal Reflections
David Mayhew on the origins of Congress: The Electoral Connection (1974). Watch at https://www.youtube.com/watch?v=u3VCLfi3Bzk&feature=youtu.be.
Applications (Read One)
PDF John Zaller and Stanley Feldman. 1992. “A Simple Theory of the Survey Response: Answering Questions versus Revealing Preferences.” American Journal of Political Science 36 (3): 579–616
PDF James D. Fearon. 1995. “Rationalist Explanations for War.” International Organization 49 (3): 379–414
Additional Resources
Stephen Van Evera. 1997. Guide to Methods for Students of Political Science. Ithaca, NY: Cornell University Press, ch. 1: “Hypotheses, Laws, and Theories”
Robert H. Bates et al. 1998. Analytic Narratives. Princeton: Princeton University Press
John H. Aldrich, James E. Alt, and Arthur Lupia. 2009. “The EITM Approach: Origins and Interpretations.” In The Oxford Handbook of Political Methodology, edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press. doi:10.1093/oxfordhb/9780199286546.003.0037
Andrew T. Little and Thomas B. Pepinsky. 2016. “Simple and Formal Models in Comparative Politics.” Chinese Political Science Review 1 (3): 425–447
4 September 26: Appraisal
How should arguments be appraised (i.e., tested, evaluated, falsified)? What makes for convincing appraisal? What are various strategies of appraisal? How does “design” differ from “analysis”? In a probabilistic world, how should we draw theoretical inferences from data? What are the advantages and disadvantages of alternative appraisal strategies, as illustrated by studies of the democratic peace?
Required Readings (199 pages)
Overview
Gerring, Social Science Methodology, 74–103 (ch. 4: “Analyses”)
Philosophical Foundations of Statistical Analysis
Dienes, Understanding Psychology, 55–81 (ch. 3: “Neyman, Pearson and Hypothesis Testing”), 82–120 (ch. 4: “Bayes and the Probability of Hypotheses”), and 121–156 (ch. 5: “Fisher and the Likelihood: The Royall Road to Evidence”)
Applications to the Democratic Peace (Read All)
PDF Zeev Maoz and Bruce Russett. 1993. “Normative and Structural Causes of Democratic Peace, 1946–1986.” American Political Science Review 87 (3): 624–638
PDF Susan Peterson. 1995. “How Democracies Differ: Public Opinion, State Structure, and the Lessons of the Fashoda Crisis.” Security Studies 5 (1): 3–37
PDF Michael R. Tomz and Jessica L. P. Weeks. 2013. “Public Opinion and the Democratic Peace.” American Political Science Review 107 (4): 849–865
Additional Resources
Paul E. Meehl. 1997. “The Problem Is Epistemology, Not Statistics: Replace Significance Tests by Confidence Intervals and Quantify Accuracy of Risky Numerical Predictions.” In What If There Were No Significance Tests?, edited by Lisa L. Harlow, Stanley A. Mulaik, and James H. Steiger, 395–425. Mahwah, NJ: Lawrence Erlbaum Associates
Donald T. Campbell and Julian Stanley. 1963. Experimental and Quasi-Experimental Designs for Research. Belmont, CA: Wadsworth
Alexander L. George and Andrew Bennett. 2005. Case Studies and Theory Development in the Social Sciences. Cambridge, MA: MIT Press
5 October 3: Conceptualization and Description
This class session covers concepts (the linguistic containers political scientists use to describe the social world) and descriptive arguments (answers to “what” questions). Following Gerring, these can be viewed as forms of descriptive (as distinct from causal) discovery. While conceptualization and description are sometimes neglected, they are both valuable scientific tasks in themselves and necessary preconditions for causal arguments and analyses. To illustrate the importance of conceptualization and description and the many issues they raise, we take a sustained look at various approaches to the conceptualization and measurement of ethnicity and other forms of identity.
Required Readings (130 pages)
Overview
Gerring, Social Science Methodology, 107–40 (ch. 5: “Concepts”) and 141–54 (ch. 6: “Descriptive Arguments”)
Classic Perspectives
PDF Giovanni Sartori. 1970. “Concept Misformation in Comparative Politics.” American Political Science Review 64 (4): 1033–1053
PDF David Collier and James E. Mahon Jr. 1993. “Conceptual ‘Stretching’ Revisited: Adapting Categories in Comparative Analysis.” American Political Science Review 87 (4): 845–855
Applications (Read All)
PDF Jacob S. Hacker. 2004. “Privatizing Risk without Privatizing the Welfare State: The Hidden Politics of Social Policy Retrenchment in the United States.” American Political Science Review 98 (2): 243–260
PDF Kanchan Chandra. 2005. “Ethnic Parties and Democratic Stability.” Perspectives on Politics 3 (2): 235–252
PDF Alisha C. Holland. 2016. “Forbearance.” American Political Science Review 110 (2): 232–246
Additional Resources
Richard Locke and Kathleen Thelen. 1998. “Problems of Equivalence in Comparative Politics: Apples and Oranges, Again.” APSA-CP: Newsletter of the APSA Organized Section in Comparative Politics 9 (1): 9–12
Paul Pierson. 2000. “Increasing Returns, Path Dependence, and the Study of Politics.” American Political Science Review 94 (2): 251–267
Rawi Abdelal, Yoshiko M. Herrera, Alastair Iain Johnston, and Rose McDermott. “Identity as a Variable.” Chapter 1 in Measuring Identity: A Guide for Social Scientists, 17–32
6 October 10: Measurement
→ Introduction and theory section of research proposal due
This session moves from descriptive discovery to descriptive appraisal—that is, from theoretical concepts and arguments to the empirical task of measurement. The readings discuss procedures for constructing operational measures of concepts and criteria for evaluating measurement validity. We consider how choices regarding measurement can affect the appraisal of descriptive arguments. To illustrate the pitfalls and trade-offs of different measurement strategies, we examine various approaches to conceptualizing and measuring democracy.
Required Readings (151 pages)
Overviews
Gerring, Social Science Methodology, 155–94 (ch. 7: “Measurements”)
PDF Robert Adcock and David Collier. 2001. “Measurement Validity: A Shared Standard for Qualitative and Quantitative Research.” American Political Science Review 95 (3): 529–546
Applications to Measuring Democracy (Read All)
PDF Mike Alvarez et al. 1996. “Classifying Political Regimes.” Studies in Comparative International Development 31 (2): 3–36
PDF David Collier and Steven Levitsky. 1997. “Democracy with Adjectives: Conceptual Innovation in Comparative Research.” World Politics 49 (3): 430–451
PDF Pamela Paxton. 2000. “Women’s Suffrage in the Measurement of Democracy: Problems of Operationalization.” Studies in Comparative International Development 35 (3): 92–111
PDF Shawn Treier and Simon Jackman. 2008. “Democracy as a Latent Variable.” American Journal of Political Science 52 (1): 201–217
Additional Resources
Richard E. Nisbett and Timothy DeCamp Wilson. 1977. “Telling More Than We Can Know: Verbal Reports on Mental Processes.” Psychological Review 84 (3): 231–259
Jack H. Hexter. 1986. “The Historical Method of Christopher Hill.” In On Historians, 227–251. Cambridge, MA: Harvard
Ian S. Lustick. 1996. “History, Historiography, and Political Science: Multiple Historical Records and the Problem of Selection Bias.” American Political Science Review 90 (3): 605–618
Jason Seawright and David Collier. 2014. “Rival Strategies of Validation: Tools for Evaluating Measures of Democracy.” Comparative Political Studies 47 (1): 111–138
7 October 17: Causation and Causal Arguments
→ Draft of grant proposal due (optional)
In this session, we transition from description to causation. We will cover both causal arguments (i.e., discovery) and causal analyses (i.e., appraisal), though we leave discussion of specific strategies of causal appraisal for subsequent sessions. We will consider alternative definitions of causation and discuss what constitutes a well-defined (and therefore estimable) causal effect. One of the goals of this session is to establish a framework for reasoning about causation that transcends methodological divides (e.g., quantitative vs. qualitative).
Required Readings (142 pages)
Causal Arguments and Analyses
Gerring, Social Science Methodology, 197–217 (ch. 8: “Causal Arguments”) and 218–55 (ch. 9: “Causal Analyses”)
Perspectives on Causation
PDF Henry E. Brady. 2009. “Causation and Explanation in Social Science.” Chap. 10 in The Oxford Handbook of Political Methodology, edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier, 217–270. New York: Oxford University Press, 217–249 (section 9 not required)
Counterfactuals
PDF James D. Fearon. 1991. “Counterfactuals and Hypothesis Testing in Political Science.” World Politics 43 (2): 169–195
Causation without Manipulation?
PDF Maya Sen and Omar Wasow. 2016. “Race as a Bundle of Sticks: Designs that Estimate Effects of Seemingly Immutable Characteristics.” Annual Review of Political Science 19 (1): 499–522
Additional Resources
David Lewis. 1973. “Causation.” Journal of Philosophy 70 (7): 556–567
Donald B. Rubin. 2008. “For Objective Causal Inference, Design Trumps Analysis.” Annals of Applied Statistics 2 (3): 808–840
William R. Shadish. 2010. “Campbell and Rubin: A Primer and Comparison of Their Approaches to Causal Inference in Field Settings.” Psychological Methods 15 (1): 3–17
Judea Pearl. 2009. Causality: Models, Reasoning, and Inference. 2nd ed. New York: Cambridge University Press
8 October 24: Quantitative Analysis I—Experiments
→ NSF proposal due (official application deadline: 5pm)
This is the first in a series of sessions that discuss specific causal inference strategies and research designs. We begin with what are arguably the most straightforward causal designs, ones in which units are randomly (or “as if” randomly) assigned to different treatment conditions (i.e., different levels of the causal variable of interest). Such experiments, whether controlled by the researcher or implemented by “nature,” offer the most propitious setting for estimating average treatment effects of various kinds. The applications for this week occupy different points on Dunning’s “continuum of plausibility” of as-if randomness. The examples range from temporal (Campbell and Ross 1968) and geographic (Ferwerda and Miller 2014; Kocher and Monteiro 2016) discontinuities to randomized natural (Titiunik 2015), field (Broockman and Kalla 2016), and survey (Hainmueller, Hangartner, and Yamamoto 2015) experiments. We will discuss trade-offs between internal and external validity and other issues that arise in (quasi-)experimental settings.
Required Readings (156 pages)
Overview
Gerring, Social Science Methodology, 256–90 (ch. 10: “Causal Strategies: X and Y ”)
Natural Experiments
PDF Thad Dunning. 2008. “Improving Causal Inference: Strengths and Limitations of Natural Experiments.” Political Research Quarterly 61 (2): 282–293
Applications (Read All)
PDF Donald T. Campbell and H. Laurence Ross. 1968. “The Connecticut Crackdown on Speeding: Time-Series Data in Quasi-Experimental Analysis.” Law & Society Review 3 (1): 33–54
PDF Jeremy Ferwerda and Nicholas L. Miller. 2014. “Political Devolution and Resistance to Foreign Rule: A Natural Experiment.” American Political Science Review 108 (3): 642–660
PDF Matthew A. Kocher and Nuno P. Monteiro. 2016. “Lines of Demarcation: Causation, Design-Based Inference, and Historical Research.” Perspectives on Politics 14 (4): 952–975 (critique of Ferwerda and Miller 2014)
PDF Rocío Titiunik. 2015. “Drawing Your Senator from a Jar: Term Length and Legislative Behavior.” Political Science Research and Methods 4 (2): 293–316
PDF David Broockman and Joshua Kalla. 2016. “Durably Reducing Transphobia: A Field Experiment on Door-to-Door Canvassing.” Science 352 (6282): 220–224
PDF Jens Hainmueller, Dominik Hangartner, and Teppei Yamamoto. 2015. “Validating Vignette and Conjoint Survey Experiments Against Real-World Behavior.” Proceedings of the National Academy of Sciences 112 (8): 2395–2400
9 October 31: Quantitative Analysis II—Confounders, Colliders, Mechanisms, and Instruments
In this session we broaden our focus to consider causal designs that, in Gerring’s words, require going “beyond X and Y ” (i.e., beyond the treatment and the outcome). These include designs that: (1) adjust for the confounding effect of common causes of X and Y ; (2) examine the mediators and/or moderators of the X–Y relationship; or (3) use an instrument to induce/isolate random variation in the causal variable of interest. We also consider selection bias, which arises from inappropriate adjustment or conditioning. Finally, we examine critiques of observational studies (relative to experiments) and responses to those critiques.
Required Readings (174 pages)
Overview
Gerring, Social Science Methodology, 291–326 (ch. 11: “Causal Strategies: Beyond X and Y ”)
Example of Selection Bias
PDF Barbara Geddes. 2003. Paradigms and Sand Castles: Theory Building and Research Design in Comparative Politics. Ann Arbor: University of Michigan Press, 89–106 (part of ch. 3: “How the Cases You Choose Affect the Answers You Get”)
Example of De-confounding
PDF Ebonya L. Washington. 2008. “Female Socialization: How Daughters Affect Their Legislator Fathers’ Voting on Women’s Issues.” American Economic Review 98 (1): 311–332
Example of Mediation/IV
PDF Robert S. Erikson and Laura Stoker. 2011. “Caught in the Draft: The Effects of Vietnam Draft Lottery Status on Political Attitudes.” American Political Science Review 105 (2): 221–237
The Debate over Experimental vs. Observational Studies
PDF Joshua D. Angrist and Jörn-Steffen Pischke. 2010. “The Credibility Revolution in Empirical Economics: How Better Research Design is Taking the Con out of Econometrics.” Journal of Economic Perspectives 24 (2): 3–30
PDF Alan S. Gerber, Donald P. Green, and Edward H. Kaplan. 2014. “The Illusion of Learning from Observational Research.” Chap. 1 in Field Experiments and Their Critics: Essays on the Uses and Abuses of Experimentation in the Social Sciences, edited by Dawn Langan Teele, 9–32. New Haven, CT: Yale University Press
PDF Susan C. Stokes. 2014. “A Defense of Observational Research.” Chap. 2 in Field Experiments and Their Critics: Essays on the Uses and Abuses of Experimentation in the Social Sciences, edited by Dawn Langan Teele, 33–57. New Haven, CT: Yale University Press
PDF Lant Pritchett and Justin Sandefur. 2015. “Learning from Experiments when Context Matters.” American Economic Review 105 (5): 471–475
Additional Resources
Jasjeet S. Sekhon. 2009. “Opiates for the Matches: Matching Methods for Causal Inference.” Annual Review of Political Science 12: 487–508
Paul R. Rosenbaum. 2010. Design of Observational Studies. New York: Springer
Jasjeet S. Sekhon and Rocío Titiunik. 2012. “When Natural Experiments Are Neither Natural nor Experiments.” American Political Science Review 106 (1): 35–57
John G. Bullock, Donald P. Green, and Shang E. Ha. 2010. “Yes, But What’s the Mechanism? (Don’t Expect an Easy Answer).” Journal of Personality and Social Psychology 98 (4): 550–558
Angus Deaton. 2014. “Instruments, Randomization, and Learning about Development.” Chap. 8 in Field Experiments and Their Critics: Essays on the Uses and Abuses of Experimentation in the Social Sciences, edited by Dawn Langan Teele, 141–184. New Haven, CT: Yale University Press
10 November 7: Qualitative Analysis I—The Methods Wars
In the next three sessions, we turn to the analysis of non-numerical data. This week, we read a debate that unfolded in the 1990s and 2000s about how political scientists should draw conclusions from qualitative data. King, Keohane, and Verba (1994) is so widely known that it is typically referred to simply as “KKV.” KKV’s claim that qualitative inference should follow the same principles as quantitative inference produced a number of critical responses and spurred substantial growth in the field of qualitative political methodology.
Required Readings (233 pages)
The Opening Salvo
Gary King, Robert Keohane, and Sidney Verba. 1994. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton, NJ: Princeton University Press, 1–33 (ch. 1) and 115–230 (ch. 4–6)
Critical Responses
PDF David Collier, Henry E. Brady, and Jason Seawright. 2010a. “Critiques, Responses, and Trade-Offs: Drawing Together the Debate.” Chap. 8 in Rethinking Social Inquiry: Diverse Tools, Shared Standards, edited by Henry E. Brady and David Collier, 112–140. Lanham, MD: Rowman & Littlefield
PDF David Collier, Henry E. Brady, and Jason Seawright. 2010b. “Sources of Leverage in Causal Inference: Toward an Alternative View of Methodology.” Chap. 9 in Rethinking Social Inquiry: Diverse Tools, Shared Standards, edited by Henry E. Brady and David Collier, 141–173. Lanham, MD: Rowman & Littlefield
PDF James Mahoney and Gary Goertz. 2006. “A Tale of Two Cultures: Contrasting Quantitative and Qualitative Research.” Political Analysis 14 (3): 227–249
Additional Resources
Henry E. Brady and David Collier, eds. 2010. Rethinking Social Inquiry: Diverse Tools, Shared Standards. 2nd ed. Lanham, MD: Rowman & Littlefield
James Mahoney and Gary Goertz. 2012. A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences. Princeton, NJ: Princeton University Press
Gerring, Social Science Methodology, 327–58 (ch. 12: “Varying Approaches to Causal Inference”)
John Gerring. 2017. “Qualitative Methods.” Annual Review of Political Science 20 (1): 15–36
11 November 14: Qualitative Analysis II—Case Studies and Process Tracing
This session, our second on qualitative methods, delves into the specifics of case study research. It covers both cross-case comparison and within-case analysis, with an emphasis on process tracing.
Required Readings (175 pages)
Case Selection
PDF Jason Seawright and John Gerring. 2008. “Case Selection Techniques in Case Study Research: A Menu of Qualitative and Quantitative Options.” Political Research Quarterly 61 (2): 294–308
Process Tracing
PDF David Waldner. 2015. “What Makes Process Tracing Good? Causal Mechanisms, Causal Inference, and the Completeness Standard in Comparative Politics.” In Process Tracing: From Metaphor to Analytic Tool, edited by Jeffrey Checkel and Andrew Bennett, 126–52. New York: Cambridge University Press
PDF Tasha Fairfield and Andrew E. Charman. 2017. “Explicit Bayesian Analysis for Process Tracing: Guidelines, Opportunities, and Caveats.” Political Analysis 25 (3): 363–380
Applications
PDF Ruth Berins Collier and David Collier. 2002. Shaping the Political Arena: Critical Junctures, the Labor Movement, and Regime Dynamics in Latin America. Notre Dame, IN: University of Notre Dame Press. Originally published in 1991, 3–23 (“Overview”) and 27–39 (ch. 1: “Framework”)
PDF Diana Dumitru and Carter Johnson. 2011. “Constructing Interethnic Conflict and Cooperation: Why Some People Harmed Jews and Others Helped Them during the Holocaust in Romania.” World Politics 63 (1): 1–42
PDF Ana Catalano Weeks. 2018. “Why Are Gender Quota Laws Adopted by Men? The Role of Inter- and Intraparty Competition.” Comparative Political Studies 51 (14): 1935–1973
Additional Resources
John Gerring. 2004. “What Is a Case Study and What Is It Good For?” American Political Science Review 98 (2): 341–354
Alexander L. George and Andrew Bennett. 2005. Case Studies and Theory Development in the Social Sciences. Cambridge, MA: MIT Press
Paul Pierson. 2011. Politics in Time: History, Institutions, and Social Analysis. Princeton, NJ: Princeton University Press
James Mahoney and Kathleen Thelen, eds. 2015. Advances in Comparative-Historical Analysis. Cambridge: Cambridge University Press
David Collier. 2011. “Understanding Process Tracing.” PS: Political Science and Politics 44 (4): 823–830
Richard A. Nielsen. 2016. “Case Selection via Matching.” Sociological Methods & Research 45 (3): 569–597
12 November 21: Qualitative Analysis III—Fieldwork and Interpretation
In our final session on qualitative analysis, we will discuss fieldwork and ethnography. We will give particular attention to interpretive methods, which focus on how people understand and give meaning to the world around them. In contrast to positivism, interpretivism emphasizes the importance of inhabiting the perspective of the subjects of study and the impossibility of studying social phenomena neutrally or objectively. Positivism and interpretivism are often considered incompatible opposites, but as the Nielsen and Cramer readings suggest, many scholars find both perspectives valuable.
Required Readings (146 pages)
Perspectives
PDF Elisabeth Jean Wood. 2009. “Field Research.” In The Oxford Handbook of Comparative Politics, edited by Carles Boix and Susan C. Stokes. Oxford University Press. doi:10.1093/oxfordhb/9780199566020.003.0005
PDF Timothy Pachirat. 2006. “We Call It a Grain of Sand: The Interpretive Orientation and a Human Social Science.” Chap. 21 in Interpretation and Method: Empirical Research Methods and the Interpretive Turn, 1st ed., edited by Dvora Yanow and Peregrine Schwartz-Shea, 373–379. Armonk, NY: M. E. Sharpe
PDF Lee Ann Fujii. 2015. “Five Stories of Accidental Ethnography: Turning Unplanned Moments in the Field into Data.” Qualitative Research 15 (4): 525–539
PDF Richard A. Nielsen. 2019. “Recite! Interpretive Fieldwork for Positivists.” In An Unorthodox Guide to Fieldwork, edited by Peter Krause and Ora Szekely. New York: Columbia University Press
Applications
PDF Katherine J. Cramer. 2016. The Politics of Resentment: Rural Consciousness in Wisconsin and the Rise of Scott Walker. Chicago: University of Chicago Press, 26–44 (ch. 2: “A Method of Listening”) and 45–89 (ch. 3: “The Contours of Rural Consciousness”)
PDF Diana Fu. 2017. “Disguised Collective Action in China.” Comparative Political Studies 50 (4): 499–527
Additional Resources
Dvora Yanow and Peregrine Schwartz-Shea. 2015. Interpretation and Method: Empirical Research Methods and the Interpretive Turn. New York: Routledge
Lisa Wedeen. 2010. “Reflections on Ethnographic Work in Political Science.” Annual Review of Political Science 13 (1): 255–272
Richard Fenno. 1978. Home Style. Boston: Little, Brown, 249–95 (Appendix: “Notes on Method: Participant Observation”)
November 28: NO CLASS (Thanksgiving)
13 December 5: Mixing Methods (Or Not)
In this, our final session focused explicitly on methodology, we consider the issue of whether and how to reconcile and combine different methodological approaches. Do different methodological approaches share an underlying unity of logic and standards, or are they essentially different “cultures” that can coexist only at a safe distance from one another? Can and should a given study employ multiple methods to complement each other, or is knowledge cumulation best served by methodological specialization?
Required Readings (170 pages)
Overview
Gerring, Social Science Methodology, 359–76 (ch. 13: “Unity and Plurality”) and 379–93 (ch. 14: “Setting Standards”)
Approaches
PDF Evan S. Lieberman. 2005. “Nested Analysis as a Mixed-Method Strategy for Comparative Research.” American Political Science Review 99 (3): 435–452
PDF Francesca Refsum Jensenius. 2014. “The Fieldwork of Quantitative Data Collection.” PS: Political Science & Politics 47 (2): 402–404
PDF Peter Lorentzen, M. Taylor Fravel, and Jack Paine. 2017. “Qualitative Investigation of Theoretical Models: The Value of Process Tracing.” Journal of Theoretical Politics 29 (3): 467–491
Critiques
PDF Amel Ahmed and Rudra Sil. 2012. “When Multi-Method Research Subverts Methodological Pluralism—or, Why We Still Need Single-Method Research.” Perspectives on Politics 10 (4): 935–953
PDF Scott Gehlbach. 2015. “The Fallacy of Multiple Methods.” CP: Newsletter of the Comparative Politics Organized Section of the American Political Science Association 25 (2): 11–12. http://comparativenewsletter.com/files/archived_newsletters/newsletter_fall2015.pdf
Applications
PDF Anna Grzymala-Busse. 2015. “Weapons of the Meek: How Churches Influence Public Policy.” World Politics 68 (1): 1–36
PDF Daniel C. Mattingly. 2016. “Elite Capture: How Decentralization and Informal Institutions Weaken Property Rights in China.” World Politics 68 (3): 383–412
Additional Resources
Macartan Humphreys and Alan M. Jacobs. 2015. “Mixing Methods: A Bayesian Approach.” American Political Science Review 109 (4): 653–673
James Fearon and David Laitin. 2009. “Integrating Qualitative and Quantitative Methods.” In The Oxford Handbook of Political Methodology, edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press. doi:10.1093/oxfordhb/9780199286546.003.0033
Thad Dunning. 2012. Natural Experiments in the Social Sciences: A Design-Based Approach. New York: Cambridge University Press
Jason Seawright. 2016. Multi-Method Social Science: Combining Qualitative and Quantitative Tools. New York: Cambridge University Press
December 12: NO CLASS
→ Research design assignment due
This syllabus was last modified on August 14, 2019.