




An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

Nancy L. Leech
University of Colorado at Denver and Health Sciences Center

Anthony J. Onwuegbuzie
Sam Houston State University

One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding the multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to understand phenomena more fully for school psychology research and beyond. The authors describe seven qualitative analysis tools: methods of constant comparison, keywords-in-context, word count, classical content analysis, domain analysis, taxonomic analysis, and componential analysis. Then, the authors outline when to use each type of analysis. In so doing, the authors use real qualitative data to help distinguish the various types of analyses. Furthermore, flowcharts and tables are provided to help delineate when to choose each type of analysis. Finally, the role of computer-assisted software in the qualitative data-analytic process is discussed. As such, use of the analyses outlined in this article should help to promote rigor in qualitative research.

Keywords: qualitative data analysis, analysis, data analysis triangulation, constant comparison analysis

In the field of psychology, since the 1960s, qualitative research approaches have been used by researchers to obtain “a more naturalistic,

Nancy L. Leech, PhD, is an assistant professor in the Research and Evaluation Program in the Division of Education Psychology at the University of Colorado at Denver and Health Sciences Center. Her area of research is promoting new developments and better understandings in applied qualitative, quantitative, and mixed methodologies. Dr. Leech has received several outstanding paper awards, including the Southwest Educational Research Association (SERA) Outstanding Paper Award in 2003 and 2004 and the Mid-South Educational Research Association (MSERA) Outstanding Paper Award in 2004. In 2004 and 2005, Dr. Leech was awarded the Outstanding Researcher Award from the School of Education and Human Development at the University of Colorado at Denver and Health Sciences Center. She received her PhD from Colorado State University.

School Psychology Quarterly, 2007, Vol. 22, No. 4, 557–584. Copyright 2007 by the American Psychological Association. 1045-3830/07/$12.00 DOI: 10.1037/1045-3830.22.4.557


contextual and holistic understanding of human beings in society” (Todd, Nerlich, & McKeown, 2004, p. 4; emphasis in original). These approaches have focused on studying phenomena in their natural settings and striving to make sense of or interpreting phenomena with respect to the meanings people bring to them (Banister, Burman, Parker, Taylor, & Tindal, 1994; Denzin & Lincoln, 2005a). Qualitative research has particular appeal for the field of school psychology. As noted by Nastasi and Schensul (2005), qualitative research techniques are essential for

documenting the adaptations necessary for application of interventions to real-life contexts and for identifying core intervention components which are related to desired outcomes. Furthermore, qualitative methods can help researchers to describe various manifestations of intended outcomes that may not be reflected in standardized instruments and to identify unintended positive or negative outcomes for the individual and institution/community. (p. 187)

Further, qualitative research is extremely useful for obtaining insights into regular or problematic experiences and the meaning attached to these experiences of selected individuals (e.g., biography, auto-biography, case study, oral history, life history, auto-ethnography) and groups (e.g., ethnography, phenomenology, grounded theory, critical theory), which, under certain conditions (e.g., data saturation, theoretical saturation, informational redundancy), can achieve Verstehen or understanding (Bogdan & Biklen, 2003; Onwuegbuzie & Leech, 2007a). For example, a case study (i.e., formal study to understand issues intrinsic to the person, group, event, etc.; Stake, 2005) can be used to examine a construct of interest to a school psychologist (e.g., autism, bipolar disorder, depression, bullying) as it manifests itself in a particular individual or group of individuals. An ethnographic study (i.e., “process and product of describing cultural behavior”; Schwandt, 2001, p. 44) can be used to describe and interpret the cultural behaviors, customs, and experiences of children with special needs. A phenomenological inquiry (i.e., study to describe the meaning of the lived experiences for selected individuals; Creswell, 2007) can be utilized to describe the meaning of the lived experiences for victims of bullying.

Anthony Onwuegbuzie is professor in the Department of Educational Leadership and Counseling at Sam Houston State University. He teaches courses in qualitative research, quantitative research, and mixed methods. His research topics primarily involve disadvantaged and under-served populations such as minorities, children living in war zones, students with special needs, and juvenile delinquents. He also writes extensively on methodological topics. In addition to 2 books, he has published more than 200 articles in refereed journals and book chapters. Also, he has made more than 380 presentations and keynote addresses at regional, national, and international conferences. Presently, he serves as a co-editor of Research in the Schools (a nationally/internationally refereed journal) and editor of the Research News and Comments section of Educational Researcher. He has received numerous outstanding paper awards, including the Literati Club Award for Excellence 2003, an International Award.

Correspondence concerning this article should be addressed to Nancy L. Leech, University of Colorado at Denver, School of Education, Campus Box 106, P.O. Box 173364, Denver, CO 80217. E-mail: [email protected]



Grounded theory research (i.e., study using a rigorous set of procedures to produce substantive theory of social phenomena; Glaser & Strauss, 1967) could be employed to generate or expand a theory of hope or another positive psychological construct. Regardless of the research design or method used, qualitative research can inform theory and model development, if it is conducted in a way that leads to insights into particular psychological, social, and/or cultural processes and practices that exist within a specific setting, location, time, context, event, incident, activity, and/or experience (Connolly, 1998; Onwuegbuzie & Leech, 2004).

Quantitative research is most helpful when “answering questions of who, where, how many, how much, and what is the relationship between specific variables” (Adler, 1996, p. 5). However, quantitative research is not apt for answering why and how questions. In contrast, qualitative research can address such process-oriented questions. Another limitation of quantitative research in general, and intervention research in particular, stems from the “lack of documentation of the challenges encountered in implementing interventions designed to change or reform existing practice” (Nastasi & Schensul, 2005, p. 186). Qualitative research can play an important role here. Also, by its very nature, qualitative research allows school psychologists to focus on cultural and contextual factors that enhance or impede the efficacy and social/ecological validity of interventions or programs (Nastasi & Schensul, 2005).

Despite the logical appeal of using qualitative techniques, in the field of school psychology, the use of qualitative research has been extremely scant. For example, Mihalas, Powell, Onwuegbuzie, Suldo, and Daley (in press), who examined 873 articles published in the four major school psychology journals (i.e., Journal of School Psychology, Psychology in the Schools, School Psychology Quarterly, School Psychology Review) from 2001 through 2005, found that only six studies published in the 5-year period (i.e., 1.37% of the total number of empirical articles) represented purely qualitative research. Recently, there has been a special issue published containing qualitative research (i.e., Nastasi & Schensul, 2005). Yet, clearly more qualitative research studies are needed in school psychology research.

One reason for the historical dearth of qualitative research investigations in school psychology might stem from the practical roadblocks to using qualitative methods.1 These practical roadblocks include the fact that

1 In addition to practical roadblocks, there are historical and epistemological reasons for the lack of qualitative research in the school psychology field. In particular, psychological research has been dominated by (quantitative) experimental psychology, which has its roots in physiology, physics, and biology, and quantitative psychology, arising from the advances in mathematics, statistics, and psychometrics (Todd et al., 2004).



qualitative research requires expertise in designing and implementing quality studies. Another reason for the scant attention paid to qualitative research techniques by school psychology researchers is that the rationale for doing so has not been made sufficiently explicit. In particular, many school psychology researchers might not be aware of (a) the potential richness of qualitative data and (b) the array of tools available for analyzing qualitative data.

QUALITATIVE DATA

Qualitative data have many positive features. First and foremost, typically, they provide naturally occurring information that allows school psychology researchers to increase their understanding of phenomena. Second, qualitative data tend to be collected in close proximity to the specific situation, such as via direct observation or interview, with the influence of the local context being taken into account and not being discarded (Miles & Huberman, 1994). Third, qualitative data often contain some inherent “richness and holism, with strong potential for revealing complexity” (Miles & Huberman, 1994, p. 10), which yield thick, rich descriptions that are contextualized (Onwuegbuzie & Leech, 2004). Fourth, qualitative data often are collected over a long period, allowing for longitudinal analyses of historical, institutional, psychological, and social processes (Nastasi & Schensul, 2005). Fifth, as noted earlier, qualitative data, which often center on people’s lived experiences, allow school psychology researchers to study phenomena and strive to make sense of, or interpret, them with respect to the meanings people bring to them (Denzin & Lincoln, 2005b). Finally, qualitative data, by stemming from the particular, facilitate school psychology researchers in studying how “cultural meanings might be exchanged and negotiated as a result of intracultural attempts to find solutions to problems” (Chambers, 2000, p. 856).

Furthermore, qualitative data can be used to strengthen quantitative research designs in general and intervention research designs (i.e., experimental, quasi-experimental) in particular. In fact, building on the conceptualizations of Collins, Onwuegbuzie, and Sutton (2006), four rationales prevail for using qualitative data to improve quantitative research designs: participant enrichment, instrument fidelity, treatment integrity, and significance enhancement. Participant enrichment represents the use of qualitative techniques for the rationale of optimizing the sample (e.g., increasing the number of participants) in a quantitative study. For example, school psychology researchers could conduct a qualitative pilot study to determine the best ways to identify participants for their quantitative study by interviewing potential participants—a concept that Collins et al. (2006, p. 77) coined as “prebriefing”—to obtain qualitative information that establishes their suitability and willingness to participate in the quantitative study.

Instrument fidelity denotes steps taken by researchers to maximize the appropriateness and/or utility of the instruments employed in a quantitative investigation. For instance, a school psychology researcher might conduct a pilot qualitative study to assess the appropriateness and/or utility of existing instruments with the objective of making necessary modifications or developing and improving a new instrument. Alternatively, the researcher could assess instrument fidelity on a continual basis and make modifications, where needed, at one or more phases of the quantitative study.

Treatment integrity represents using qualitative techniques to assess the fidelity of interventions, treatments, or programs. In order for an intervention or program to possess integrity, it should be implemented as intended—that is, it should be implemented in a manner that is consistent with the underlying theory and principles guiding the research design and also reflect the contextual processes that affect program delivery (Gresham, MacMillan, Beebe-Frankenberger, & Bocian, 2000; Onwuegbuzie, 2003). Qualitative assessment of treatment integrity could involve the use of techniques such as interviews, focus groups, observations, and documents. As noted by Collins et al., the use of qualitative techniques for assessing treatment integrity would increase insights into treatment integrity and most likely lead to the identification of implementation bias (i.e., the extent to which implementation of the intervention deviated from the protocol; Onwuegbuzie, 2003). The more information that is obtained about the intervention at various stages of the investigation, the better position the school psychology researcher will be in to put quantitative findings in their appropriate context.

Significance enhancement involves the use of qualitative techniques in order to maximize researchers’ interpretations of quantitative data. A school psychology researcher can use qualitative data to enhance statistical analyses. Moreover, using quantitative and qualitative data analysis techniques side-by-side typically enhances the interpretation of significant findings in research (Collins et al., 2006; Onwuegbuzie & Leech, 2004, 2006).

Thus, in addition to conducting qualitative studies in their own right, qualitative techniques can be used by school psychology researchers to strengthen the design of quantitative studies. The latter application would yield what is commonly called “mixed methods research” (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003). As noted by Kratochwill and Stoiber (2000), mixed methods research involves the “interweaving of quantitative and qualitative research methodologies so that the most accurate and authentic picture of the knowledge bases and skills associated with change processes is available” (p. 600).

QUALITATIVE DATA ANALYSIS

One of the most important steps in the research process is analysis of data. Research from the quantitative paradigm has multiple types of analyses, commonly known as statistics. Virtually all graduate students in the social and behavioral sciences, including the field of school psychology, are required to take at least one statistics course as a necessary part of their degree programs (Onwuegbuzie & Wilson, 2003), in which they learn both descriptive and inferential statistical techniques. These statistics courses have helped to sensitize students about the importance of the data analysis process in quantitative studies.

In qualitative research, discussion of analysis clearly is not as common as is the case for quantitative research. In fact, many schools of education offer only one qualitative research course (Leech & Goodwin, 2004), and the one course offered commonly does not include much information about data analysis. With such little focus on analysis, we believe many qualitative researchers believe that there is only one way to analyze qualitative data—through the method of constant comparative or constant comparison analysis (Glaser & Strauss, 1967). For example, recently, in an informal poll of school of education faculty at a large university, one question was, “How can qualitative data be analyzed?” More than 80% of the participants responded with “constant comparison analysis” (Leech, 2004). We contend that using a one-size-fits-all approach (i.e., constant comparison analysis) to analyzing qualitative data sometimes will lead to interpretations that are not consistent with the underlying data—thereby affecting legitimation via components such as interpretive validity and theoretical validity (Maxwell, 1992, 2005).

One way this incorrect assumption (i.e., that constant comparative analysis is the only available qualitative data analysis tool) is promoted is through textbooks. There are many texts available for understanding the discipline of qualitative research (e.g., Eisner, 1998; Glesne & Peshkin, 1992) and how to write qualitative research reports (e.g., Emerson, Fretz, & Shaw, 1995; Richardson & St. Pierre, 2005; Wolcott, 2001). Yet, the majority of available texts for qualitative research include at most only one chapter, if any, on data analysis (e.g., Berg, 2004; Bogdan & Biklen, 2003; Creswell, 2007; Schram, 2003; Shank, 2002).

In particular, in the latest edition of the Handbook of Qualitative Research (Denzin & Lincoln, 2005a), one of the leading textbooks used in qualitative research courses in the United States, of the 44 chapters contained, only two chapters deal explicitly with qualitative data analysis techniques (i.e., Chapters 32 and 34). Moreover, although well written, these chapters are very conceptual in nature; thus, the authors do not show readers how to conduct the analyses that they discuss. This lack of presentation of applied qualitative data analysis strategies in such a high-profile publication prevails despite the fact that the editors of the book declare the following:

Qualitative researchers must remain faithful to this indigenous organization and deploy analytic strategies that are fitted to it. We need rigorous work that pays systematic attention to the systematic relations among the interaction order, orders of talk, representational orders, and the organized properties of material culture. . . . We need more principled and disciplined ways of accounting for the world and its organization. (pp. 646–647; emphasis added)

Many other leading qualitative research textbooks (e.g., Lincoln & Guba, 1985) also do not provide explicit details as to how to analyze qualitative data.

One text, Miles and Huberman (1994), does focus on qualitative data analyses. Since then, several more textbooks dealing with qualitative data analyses have been published (Coffey & Atkinson, 1996; Dey, 1993; Phillips & Jorgensen, 2002; Silverman, 2001). However, many of these textbooks focus on one data analysis technique (e.g., discourse analysis; Phillips & Jorgensen, 2002). Thus, these books do not provide a comprehensive treatment of qualitative data analysis techniques; therefore, several textbooks or supplemental information in lectures have to be included by qualitative research instructors in order for their students to be exposed to a repertoire of analytical tools.

As such, we believe many researchers from the field of school psychology and elsewhere do not realize that there are many tools available for analyzing qualitative data. Indeed, Leech and Onwuegbuzie (2005) identified 21 qualitative data analysis techniques. The purpose of this article is to present seven different data analysis techniques, illustrate the use of each technique with the same data set, and demonstrate how utilizing more than one type of analysis can increase understanding of the data. More specifically, we describe the following seven qualitative analysis tools: method of constant comparison, keywords-in-context, word count, classical content analysis, domain analysis, taxonomic analysis, and componential analysis. These techniques were selected because they represent the earliest formalized qualitative data analysis techniques, being conceptualized either in the 1960s (e.g., method of constant comparison; Glaser & Strauss, 1967) or 1970s (e.g., domain analysis, taxonomic analysis, componential analysis; Spradley, 1979), and they represent the most commonly used analysis techniques. We outline when to use each type of analysis. In so doing, we use real qualitative data to help distinguish the various types of analyses. Finally, we discuss the role of computer-assisted software in the qualitative data-analytic process. Thus, our outline of qualitative data analysis techniques should help to promote rigor in qualitative research.

DESCRIPTIONS AND EXAMPLES OF A VARIETY OF THE AVAILABLE ANALYSIS TOOLS

As eloquently stated by Hatch (2002):

Data analysis is a systematic search for meaning. It is a way to process qualitative data so that what has been learned can be communicated to others. Analysis means organizing and interrogating data in ways that allow researchers to see patterns, identify themes, discover relationships, develop explanations, make interpretations, mount critiques, or generate theories. It often involves synthesis, evaluation, interpretation, categorization, hypothesizing, comparison, and pattern finding. It always involves what Wolcott calls “mindwork”. . . Researchers always engage their own intellectual capacities to make sense of qualitative data. (p. 148)

A crucial phrase in the above quotation is “data analysis is a systematic search for meaning.” Such a phrase might give school psychology researchers with little or no experience with qualitative data analysis the impression that analyzing qualitative data represents a daunting task. However, there are multiple qualitative data analysis tools available for school psychology researchers that facilitate a systematic search for meaning. In fact, there are many more than could be included in this article. To assist in understanding each type of analysis presented (i.e., method of constant comparison, keywords-in-context, word count, classical content analysis, domain analysis, taxonomic analysis, componential analysis), a brief excerpt of data will be analyzed using each tool. It is important to note that there are many types of data available for qualitative researchers, including interview data, survey data, observational data, personal journals, diaries, permanent records, transcriptions of meetings, and photographs. This list is only a beginning. All of the types of analyses discussed in this article can be utilized with these types of data. Also, depending on the research question, all of these analyses can be utilized with an array of qualitative research designs, including the following: ethnography, ethnomethodology, case study, participant observation, field research or field study, naturalistic study, phenomenological study, ecological descriptive study, descriptive study, symbolic interactionist study, microethnography, interpretive research, action research, historiography, literary criticism, and grounded theory.

In the following sections, each method is described and an example of its use with real data is presented. The data were collected in a study by Hess, Kozleski, and Molina (2006). The purpose of the study was to investigate parents’ perceptions of special education programs in order to develop a better understanding of inclusion. The data presented were collected through a focus group with parents of children with disabilities.

Constant Comparison Analysis

Constant comparison analysis likely is the most commonly used type of analysis for qualitative data. Some authors use the term “coding” when referring to this type of analysis (Miles & Huberman, 1994; Ryan & Bernard, 2000). Glaser and Strauss (1967), the fathers of grounded theory, created the method of constant comparison analysis.

When a researcher is interested in utilizing an entire dataset to identify underlying themes presented through the data, a constant comparison analysis can be helpful. Constant comparison can be undertaken deductively (i.e., codes are identified prior to analysis and then looked for in the data), inductively (i.e., codes emerge from the data), or abductively (i.e., codes emerge iteratively).

To perform a constant comparison analysis, the researcher first reads through the entire set of data (this also could be a subset of the data). After doing so, the researcher chunks the data into smaller meaningful parts. Then, the researcher labels each chunk with a descriptive title or a “code.” The researcher takes pains to compare each new chunk of data with previous codes, so similar chunks will be labeled with the same code. After all the data have been coded, the codes are grouped by similarity, and a theme is identified and documented based on each grouping.
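The chunking and coding steps above are interpretive work done by the researcher, but the final step of grouping coded chunks is mechanical enough to sketch in code. The sketch below is illustrative only; the article does not prescribe software, the chunk/code pairs are drawn from its IEP worked example, and the function name `group_by_code` is our own.

```python
from collections import defaultdict

# Chunks of focus-group data paired with the code a researcher assigned,
# after comparing each new chunk against earlier codes (IEP example data).
coded_chunks = [
    ("Really difficult for me to sit through IEP meetings", "Difficult to sit"),
    ("Start talking speech jibberish", "Talk jibberish"),
    ("I would take that paper home", "Paper"),
    ("I'd look at it and I'd be thinking", "Paper"),
    ("I am his advocate", "Advocate"),
    ("I have to speak up", "Advocate"),
]

def group_by_code(pairs):
    """Group coded chunks by their code; the researcher then develops
    themes by inspecting and combining related groups of codes."""
    groups = defaultdict(list)
    for chunk, code in pairs:
        groups[code].append(chunk)
    return dict(groups)

groups = group_by_code(coded_chunks)
# The "Paper" and "Advocate" groups each collect two chunks, which a
# researcher might develop into themes such as "Complicated IEP Papers"
# and "Child Advocate."
```

The grouping is only bookkeeping; deciding which groups of codes belong to which theme remains the researcher's judgment.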

Constant comparative analysis can be used throughout a research study. One way of using the codes is to go back out into the field and undertake member checking (Merriam, 1998) with the participants by asking if the themes, arguments, or assertions developed from the codes accurately describe their statements (Janesick, 2000; Merriam, 1998), leading to descriptive validity (Maxwell, 1992, 1995). Also, it is helpful for participants to read the definition and explanation of the themes generated from the data in order to assess their accuracy.

Constant comparison analysis originally was developed to analyze data that were collected over a series of rounds. In particular, these rounds of data analyses led to theoretical sampling, which involves the sampling of additional people, groups, events, incidents, activities, documents, and the like, in order to develop emergent themes, to assess the adequacy, relevance, and meaningfulness of themes, to refine ideas, and to identify conceptual boundaries (Charmaz, 2000). However, constant comparison analysis since has been modified to be used to analyze data collected in one round (e.g., a single round of interviews).



An Example of a Constant Comparison Analysis

Below is a brief excerpt of data and how one researcher analyzed it using constant comparison analysis. The research question was, “What is the parent’s perception of the IEP meeting?”

In order to perform the constant comparison analysis, first, the researcher reads through the data. Next, the researcher underlines chunks, or phrases, of the data (we have underlined the chunks in this excerpt). In Table 1, each chunk is assigned a code. Before a code is applied, the earlier codes are checked to see if there is an existing similar code (this is the comparison part of the analysis).

It was really difficult for me to sit through IEP meetings and different people would start talking speech jibberish, different people would say things, and I would sit there and I would really try to focus on what’s going on. But I would take that paper home, and I’d look at it and I’d be thinking, what in the world just happened? It took me pretty, several years, before I realized, I am his advocate. I have to speak up and say, okay, wait a minute, slow down, what does that mean, what did you say?

Next, the codes are combined and themes developed. For these data, the themes might be “Complicated IEP Papers” and “Child Advocate.”

Table 1. Emergent Codes From Constant Comparison Analysis

Chunks                                                Code for each chunk
-------------------------------------------------------------------------
Really difficult for me to sit through IEP meetings   Difficult to sit
Different people                                      Different people
Start talking speech jibberish                        Talk jibberish
Different people would say things                     Different people
I would sit there                                     Difficult to sit
I would really try to focus on what’s going on        Try to focus
I would take that paper home                          Paper
I’d look at it and I’d be thinking                    Paper
What in the world just happened                       What happened?
Took me pretty, several years                         Took years
I am his advocate                                     Advocate
I have to speak up                                    Advocate
Say, okay, wait a minute, slow down, what does
  that mean, what did you say                         Advocate

Keywords-in-Context

Keywords-in-context (KWIC; Fielding & Lee, 1998) is a data analysis method that reveals how respondents use words in context by comparing words that appear before and after “key words.” This type of analysis identifies how one uses the word(s) in context, which some qualitative researchers refer to as an analysis of the culture of the use of the word (Fielding & Lee). KWIC is a helpful analysis to utilize when there are specific words that are of interest to the researcher or when the data appear to be less rich in information. Many times, with data that seem uninteresting, KWIC can help identify underlying connections that the participant was implying through her/his speech. A weakness of KWIC is losing sight of the context of the words. If the words collected around the keyword are insufficient, the phrase can lose its meaning.

To perform a keywords-in-context analysis, the researcher reads through the data and identifies keywords, namely, words that are used either frequently or in an unusual manner. Then, the researcher lists the words that come before and after the keyword. At times, there can be difficulty in deciding what keyword to use in a KWIC analysis. Keywords should be chosen through previous research or theory (i.e., a priori), or through frequency of use throughout a data set (i.e., a posteriori). Like constant comparison analysis, keywords-in-context can be used throughout a research study.
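The listing of words before and after a keyword can be sketched in a few lines. The sketch below is our illustration, not part of the authors' method; the window of four words on each side is an arbitrary choice the analyst would set.

```python
# Illustrative keywords-in-context (KWIC) sketch: for each occurrence of a
# chosen keyword, collect the words appearing before and after it.

def kwic(text, keyword, window=4):
    words = text.split()
    rows = []
    for i, w in enumerate(words):
        if w.strip('.,?!"').lower() == keyword.lower():
            before = " ".join(words[max(0, i - window):i])
            after = " ".join(words[i + 1:i + 1 + window])
            rows.append((before, w, after))
    return rows

excerpt = ("It was really difficult for me to sit through IEP meetings "
           "and different people would start talking speech jibberish, "
           "different people would say things")

for before, word, after in kwic(excerpt, "different"):
    print(f"... {before} [{word}] {after} ...")
```

Run on the excerpt, this lists both occurrences of "different" with their surrounding context, the raw material for an interpretation table such as Table 2.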

An Example of a Keywords-in-Context Analysis

Utilizing the same data as above, a keywords-in-context analysis is presented below. The research question was, "What stands out for the parent in IEP meetings?"

It was really difficult for me to sit through IEP meetings and different people would start talking speech jibberish, different people would say things, and I would sit there and I would really try to focus on what's going on. But I would take that paper home, and I'd look at it and I'd be thinking, what in the world just happened? It took me pretty, several years, before I realized, I am his advocate. I have to speak up and say, okay, wait a minute, slow down, what does that mean, what did you say?

From this small amount of data, the researcher found "different" to be a keyword. The researcher read through the data, underlined the word "different," and then transferred this information to Table 2. This table includes the actual keyword, its context, and its interpretations. The overall description from this might be "In IEP meetings, different people talk, making understanding difficult."

Table 2. Analysis With Keywords-in-Context

Keyword-in-context | Interpretation
IEP meetings and different people would start talking speech jibberish | Different people talking jibberish
Different people would say things | Different people talking

Word Count

Word counts are based on the belief that all people have distinctive vocabulary and word usage patterns. These patterns become what are often called "linguistic fingerprints" (Pennebaker, Mehl, & Niederhoffer, 2003, p. 568). One way to understand the perspective of the person is to count the words used. The underlying assumption with word counts is that more important and significant words for the person will be used more often (Carley, 1993). Thus, one method of using this form of analysis is to count the number of times a specific word is used. Usually, word counts are undertaken when the entire data set has been collected, but they also can be used throughout the research process.

According to Sechrest and Sidani (1995, p. 79), "qualitative researchers regularly use terms such as 'many,' 'most,' 'frequently,' 'several,' 'never,' and so on. These terms are fundamentally quantitative." Thus, qualitative researchers can obtain more meaning by obtaining counts of words in addition to their narrative descriptions (Sandelowski, 2001). As noted by Miles and Huberman (1994), there are at least three reasons for counting themes: (a) to identify patterns more easily, (b) to verify a hypothesis, and (c) to maintain analytic integrity. Further, by counting words, school psychology researchers are able to leave an audit trail, which is recommended by many qualitative researchers as a method of evaluating legitimation or increasing legitimation, or both (Halpern, 1983; Lincoln & Guba, 1985; Onwuegbuzie & Leech, 2007b). As such, counting can help to improve the rigor of an analysis. Counting words also can enhance verstehen by preventing the researcher from overweighting or underweighting the emergent themes (Sandelowski, 2001).

There are weaknesses to word count. One drawback is that word count can decontextualize the word to a point where it is not understandable. For example, the word "I" does not increase the researcher's understanding of IEP meetings without the context around the word. Another weakness is that word count can be misleading: participants do not need to use a word frequently to portray important concepts.

Word count is especially useful with focus groups; it can help identify participants who spoke the most and who spoke the least. If a researcher counts the use of all the words and finds the most frequently used words, these words can be helpful in understanding what is important to the participant. Furthermore, these words can be utilized in KWIC analysis.


An Example of Word Count

The above data excerpt used to illustrate KWIC was typed into Microsoft Word, and the word count command was utilized to find that there are 96 words. Utilizing word count to understand the importance of a single word, the researcher could count the number of times the word "difficult" is used. The researcher utilized the research questions, "How frequently does the parent use the word 'difficult' in a given passage?" and "Are there other frequently used words that would help in understanding the parent's perceptions?" In this example, the word "difficult" is used on two occasions, which represents a usage rate of approximately 2%. The word "I" was used on five occasions, representing a 5% usage rate, and the word "say" was used on two occasions, representing a 2% usage rate. However, it could be argued that of these three words (i.e., "difficult," "I," "say"), only the word "difficult" adds significant meaning.
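The counting procedure above is simple enough to script. The sketch below is illustrative only and is not part of the authors' method; note that counts depend on tokenization choices (e.g., whether "say," and "say?" both count as "say," or whether "I'd" counts as "I"), so its output can differ slightly from the figures reported above.

```python
# Illustrative word count sketch: frequency and usage rate per word in the
# excerpt, with punctuation stripped naively. A real analysis would settle
# its tokenization rules (contractions, punctuation, case) first.

from collections import Counter

excerpt = (
    "It was really difficult for me to sit through IEP meetings and "
    "different people would start talking speech jibberish, different "
    "people would say things, and I would sit there and I would really "
    "try to focus on what's going on. But I would take that paper home, "
    "and I'd look at it and I'd be thinking, what in the world just "
    "happened? It took me pretty, several years, before I realized, I am "
    "his advocate. I have to speak up and say, okay, wait a minute, slow "
    "down, what does that mean, what did you say?"
)

words = [w.strip('.,?!"').lower() for w in excerpt.split()]
counts = Counter(words)
total = len(words)

for word in ("difficult", "different", "say"):
    print(f"{word}: {counts[word]} times ({100 * counts[word] / total:.1f}%)")
```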

Classical Content Analysis

Classical content analysis is similar to constant comparison analysis and also is used frequently in qualitative research; the main difference is that instead of creating themes, the researcher counts the number of times each code is utilized.2 This type of analysis is helpful to use when there are many codes; it can identify which codes are used most and which might be the most important concepts for the interviewee. Typically, the codes are produced deductively and then can be either included as descriptive information about the data, analyzed using quantitative procedures (Kelle, 1996), or both (Onwuegbuzie & Teddlie, 2003). Classical content analysis can be used throughout the research process.

An Example of Classical Content Analysis

The data excerpt used for the previous analyses was analyzed using classical content analysis. The codes created from constant comparison analysis were utilized (cf. Table 1). The researcher wanted to answer the research question, "What concepts (represented through codes) are discussed most?" As can be seen from Table 3, "advocate" was used most (3 times), followed by "difficult to sit," "different people," and "paper" (each code used 2 times); all the other codes were used only once.

2 Classical content analysis is not the only type of content analysis. Other types of content analyses include latent content analysis and manifest content analysis. For a discussion of these forms of content analyses, see Leech and Onwuegbuzie (2005).
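The tallying step of classical content analysis can be sketched in a few lines. This is our illustration, not the authors' procedure; the list of applied codes is transcribed from the code column of Table 1.

```python
# Illustrative classical content analysis sketch: given the codes applied
# during coding, count how often each code was used.

from collections import Counter

applied_codes = [
    "Difficult to sit", "Different people", "Talk jibberish",
    "Different people", "Difficult to sit", "Try to focus",
    "Paper", "Paper", "What happened?", "Took years",
    "Advocate", "Advocate", "Advocate",
]

code_counts = Counter(applied_codes)
for code, n in code_counts.most_common():
    print(f"{code}: {n}")
```

The output reproduces Table 3, with "Advocate" as the most frequently applied code.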

Domain Analysis

Domain analysis represents a search for the larger units of cultural knowledge, which Spradley (1979) called domains. This method of analysis stems from the belief that symbols are an important way of communicating cultural meaning. The definition of a symbol is "an act, sound, or object having cultural significance and the capacity to excite or objectify a response" (Merriam-Webster, 2005, ¶ 1). According to Spradley (1979), all symbols have three elements: (a) the symbol itself (i.e., cover term), (b) one or more referents (i.e., to what the symbol refers; included term), and (c) a relationship between the symbol and the referent (i.e., semantic relationship).

Domain analysis utilizes semantic relationships to help uncover domains. Domains are created from (a) cover terms (concepts; Y), (b) included terms (referents; X), and (c) a semantic relationship between the cover term (Y) and the included terms (X). Casagrande and Hale (1967) identified 13 types of semantic relationships. Building on Casagrande and Hale's (1967) conceptualization, Spradley (1979) proposed nine semantic relationships as being the most useful for conducting an analysis of semantic domains. Casagrande and Hale's (1967) and Spradley's (1979) types and the relationships of X and Y are listed in Table 4 and Table 5, respectively. These relationships are the foundation of domain analysis.

As with constant comparison analysis, wherein the researcher is labeling chunks of data with codes, in domain analysis the researcher is labeling the domains. As conceptualized by Spradley (1979), domain analysis involves a six-step process: (1) select a single semantic relationship (repeated), (2) prepare a domain analysis worksheet (repeated), (3) select a sample of informant statements (repeated), (4) search for possible cover terms and included terms that fit the semantic relationship (repeated), (5) formulate structural questions for each domain (repeated), and (6) make a list of all hypothesized domains. Because the result of domain analysis is to create further structural questions for the participant, this tool for analysis is best used during the data-gathering phase.

Table 3. Results From a Classical Content Analysis

Code | Number of times used
Difficult to sit | 2
Different people | 2
Talk jibberish | 1
Try to focus | 1
Paper | 2
What happened? | 1
Took years | 1
Advocate | 3

Some researchers believe that domain analysis should be reserved for pure ethnographic studies, as was suggested by the founder of domain analysis (Spradley, 1979). However, we believe the beneficial aspect of domain analysis is that it gives an alternative lens with which to understand data, and it helps in creating future questions for participants. Domain analysis should be used when researchers are interested in understanding relationships among concepts.

Table 4. Casagrande and Hale's (1967) Types of Relationships for Domain Analysis

Type | Relationship of X and Y
Attributive | X is defined with respect to one or more attributes of Y
Contingency | X is defined in relation to an antecedent or concomitant of Y
Function | X is defined as the means of effecting Y
Spatial | X is oriented spatially with respect to Y
Operational | X is defined with respect to an action Y of which it is a goal or recipient
Comparison | X is defined in terms of its similarity or contrast with Y
Exemplification | X is defined by citing an appropriate co-occurrent Y
Class inclusion | X is defined with respect to its membership in a hierarchical class Y
Synonymy | X is defined as an equivalent to Y
Antonymy | X is defined as the negation of Y
Provenience | X is defined with respect to its source, Y
Grading | X is defined with respect to its placement in a series or spectrum that also includes Y
Circularity | X is defined as X

Table 5. Spradley's (1979) Types of Relationships for Domain Analysis

Type | Relationship of X and Y
Strict inclusion | X is a kind of Y
Spatial | X is a place in Y; X is a part of Y
Cause-effect | X is a result/cause of Y
Rationale | X is a reason for doing Y
Location for action | X is a place for doing Y
Function | X is used for Y
Means-end | X is a way to do Y
Sequence | X is a step (stage) in Y
Attribution | X is an attribute of Y

Note. From The Ethnographic Interview (1st ed.) by Spradley (1979). Reprinted with permission of Wadsworth, a division of Thomson Learning: www.thomsonrights.com. Fax: 800-730-2215.


An Example of Domain Analysis

Below is the data excerpt analyzed with domain analysis. The research question was "What characteristics of IEP meetings are presented?"

It was really difficult for me to sit through IEP meetings and different people would start talking speech jibberish, different people would say things, and I would sit there and I would really try to focus on what's going on. But I would take that paper home, and I'd look at it and I'd be thinking, what in the world just happened? It took me pretty, several years, before I realized, I am his advocate. I have to speak up and say, okay, wait a minute, slow down, what does that mean, what did you say?

Using domain analysis the following terms and relationship were identified:

(a) Possible cover term—IEP meeting

(b) Possible included terms—difficult, different people, speech jibberish

(c) Possible semantic relationships—X is an attribute (characteristic) of Y

For the semantic relationship above, "X" is the included term (e.g., difficult) and "Y" is the cover term of an IEP meeting. From these terms and relationships, the researcher can construct questions—for example, "How are IEP meetings difficult?" or "What would make the speech jibberish more understandable?"
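The domain just identified can be represented as a small data structure from which structural questions are drafted. This sketch is ours; the question templates and the helper name `structural_questions` are hypothetical stand-ins for questions a researcher would phrase.

```python
# Illustrative domain analysis sketch: a domain is a cover term (Y), a
# semantic relationship, and included terms (X), with values taken from
# the example above. Structural questions are drafted for a follow-up
# interview with the participant.

domain = {
    "cover_term": "IEP meeting",
    "semantic_relationship": "X is an attribute of Y",
    "included_terms": ["difficult", "different people", "speech jibberish"],
}

def structural_questions(domain):
    y = domain["cover_term"]
    qs = [f"What are all the attributes of {y}s?"]
    for x in domain["included_terms"]:
        qs.append(f"Is '{x}' an attribute of {y}s?")
    return qs

for q in structural_questions(domain):
    print(q)
```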

Taxonomic Analysis

Taxonomic analysis is the second step after domain analysis. Domain analysis can be undertaken alone, or it can be utilized further with taxonomic analysis. Taxonomic analysis helps the researcher to understand how participants are using specific words. School psychology researchers should not assume that words participants use have the same meaning and implications. Once the domains have been identified, in taxonomic analysis, one domain is chosen and is placed into a taxonomy. A taxonomy is defined by Spradley (1979) as a "classification system" that inventories the domains into a flowchart or diagram to help the researcher understand the relationships among the domains. Similar to a domain analysis, a taxonomic analysis is a set of categories organized on the basis of a single semantic relationship. However, a taxonomic analysis differs from a domain analysis inasmuch as a taxonomy shows the relationships among all the terms in a domain. More specifically, a taxonomic analysis reveals subsets of terms and the way these subsets are related to the domain as a whole (Spradley, 1979). According to Spradley, a taxonomic analysis involves the following eight steps: (1) select a domain for taxonomic analysis; (2) identify the appropriate substitution frame for analysis (a substitution frame [e.g., "is a kind of"] is similar to a semantic relationship, although it differs in that it helps to differentiate the included terms into subgroups); (3) search for possible subsets among the included terms; (4) search for larger, more inclusive domains that might include as a subset the one you are analyzing; (5) construct a tentative taxonomy; (6) formulate structural questions to verify taxonomic relationships; (7) conduct additional structural interviews; and (8) construct a completed taxonomy.

An Example of Taxonomic Analysis

Using the domain analysis results from above, the following taxonomic analysis was completed. The research question was "How are IEP meetings described?" Thus, the substitution frame identified by the researcher is presented in Figure 1 and was "describes."

Figure 1 shows the following substitutions:

Difficult describes IEP meetings

Different people describes IEP meetings

Speech jibberish describes IEP meetings

From these substitutions, the researcher can create structural questions to ask the participants: "What are other ways of describing an IEP meeting?" and "How many different people are involved?" After these questions are answered, the researcher can refine the taxonomy and use it in the write-up to help the reader understand the data.
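The substitution step itself can be sketched: apply the substitution frame to each included term to generate the statements shown above. The sketch is illustrative only; the dictionary layout is our own, not part of Spradley's procedure.

```python
# Illustrative taxonomic analysis sketch: apply a substitution frame
# (here "describes") to each included term, producing the statements a
# researcher would then verify through structural questions.

taxonomy = {
    "domain": "IEP meetings",
    "substitution_frame": "describes",
    "terms": ["Difficult", "Different people", "Speech jibberish"],
}

statements = [
    f"{term} {taxonomy['substitution_frame']} {taxonomy['domain']}"
    for term in taxonomy["terms"]
]

for s in statements:
    print(s)
```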

Figure 1. A taxonomic representation of the data.


Componential Analysis

Componential analysis is another step that can be undertaken after domains are created. Componential analysis is a "systematic search for attributes (components of meaning) associated with cultural symbols" (Spradley, 1979, p. 174). Componential analysis is used when a researcher is trying to uncover relationships between words. This analysis is used to discover the differences between the subcomponents of domains, with the goal being to "map as accurately as possible the psychological reality of our informant's cultural knowledge" (p. 176), or in other words, to find and present the differences. Matrices and/or tables (cf. Table 6) are useful in performing a componential analysis. Usually these have at least two dimensions: (a) "the contrast set" and (b) "dimensions of contrast." The dimensions of contrast are questions formulated by the researcher to help differentiate the contrast set. Each question needs to be answered either "yes" or "no." According to Spradley (1979), a componential analysis involves the following eight steps: (1) select a contrast set for analysis, (2) inventory all contrasts previously discovered, (3) prepare a paradigm worksheet, (4) identify dimensions of contrast which have binary values, (5) combine closely related dimensions of contrast into ones that have multiple values, (6) prepare contrast questions to elicit missing attributes and new dimensions of contrast, (7) conduct an interview to elicit needed data, and (8) prepare a completed paradigm.

An Example of Componential Analysis

As shown in Table 6, the included terms identified through domain analysis were utilized as the contrast set. The dimensions of contrast were formulated through information gathered from the participant and reflection of the researcher. The research question was "How are IEP meetings helpful for parents?"

Table 6. Componential Analysis Table

Contrast set | Does it help the parent feel the needs of the child will be met? | Does it help the parent understand the process? | Does it help the parent participate in the meeting?
Difficult | No | No | No
Different people | Maybe | Maybe | No
Speech jibberish | Maybe | No | No

As shown in Table 6, the aspect "different people" is unlike the other aspects in that it might help the parent understand the process more and feel more involved. The term "difficult" did not fit with any of the dimensions of contrast, which indicates that the researcher should consider other dimensions. From this analysis, the researcher would create structural questions to fill in gaps in understanding the contrast set. One question might be "How can IEP meetings be more helpful?"
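The paradigm worksheet lends itself to a simple matrix representation. The sketch below is ours, with the values taken from Table 6; the all-"No" check merely illustrates how a researcher might flag terms, such as "difficult," that call for new dimensions of contrast.

```python
# Illustrative componential analysis sketch: the paradigm worksheet as a
# matrix of contrast-set members by dimensions of contrast. Members whose
# row is entirely "No" fit none of the current dimensions, signaling that
# further dimensions (and structural questions) are needed.

dimensions = [
    "Does it help the parent feel the needs of the child will be met?",
    "Does it help the parent understand the process?",
    "Does it help the parent participate in the meeting?",
]

paradigm = {
    "Difficult":        ["No", "No", "No"],
    "Different people": ["Maybe", "Maybe", "No"],
    "Speech jibberish": ["Maybe", "No", "No"],
}

needs_new_dimensions = [
    term for term, row in paradigm.items() if all(v == "No" for v in row)
]
print(needs_new_dimensions)
```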

CHOOSING AMONG THE SEVEN QUALITATIVE DATA ANALYSIS TOOLS

The previous sections have outlined seven powerful procedures for analyzing qualitative data. In order to help school psychology researchers to select the most appropriate procedures, we will compare and contrast each of these techniques. Increasing rigor and trustworthiness of the findings from qualitative data is accomplished through three different procedures: (a) using more than one type of analysis, (b) assessing interrater reliability, and (c) member checking. As we have stated, using more than one type of analysis can strengthen the rigor and trustworthiness of the findings via methodological triangulation (i.e., consistency among qualitative analytic procedures; Denzin, 1978). Furthermore, having multiple researchers code the data and then assessing interrater reliability can strengthen the trustworthiness of the findings via investigator triangulation (i.e., consistency among researchers; Denzin, 1978).3 To evaluate interrater reliability, the researcher can begin by rating how close the themes are between two raters. For example, the following scale can be used: 0 (no agreement), 1 (a little agreement), 2 (much agreement), 3 (total agreement). Then, the researcher would compute Kappa (Cohen, 1960) or another appropriate statistic to calculate the overall agreement between the two raters for all of the themes. For three or more raters, the multirater Kappa measure can be used (Siegel & Castellan, 1988). Some qualitative software will allow the computation of interrater reliability, for example, N6 (QSR International Pty Ltd, 2006). Finally, member checking, which involves the researcher showing the findings and/or interpretations to the participants for assessment of accuracy, can increase the rigor and trustworthiness of the findings via what we term descriptive triangulation (i.e., consistency between researcher and participant[s]). Member checking can be undertaken with all seven analyses and should be used to enhance descriptive validity, interpretive validity, and/or theoretical validity (Maxwell, 1992, 2005).

3 It should be noted that some qualitative researchers strongly disagree on philosophical grounds with the practice of using statistics to assess interrater reliability. Rather, they believe that discussion among analysts represents the best method to determine agreement. These researchers argue that having the perspectives of multiple analysts does not necessarily increase the accuracy of interpretation. Further, they contend that an analyst with a different perspective may have the greatest insight of all.
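The Kappa computation mentioned above can be sketched as follows. The two rating vectors are invented solely for illustration (they use the 0-3 agreement scale described in the text); the function implements the standard Cohen's (1960) formula, kappa = (observed agreement - chance agreement) / (1 - chance agreement).

```python
# Illustrative Cohen's kappa sketch for two raters rating the same themes.
# Observed agreement is the proportion of themes rated identically; chance
# agreement is estimated from each rater's marginal rating frequencies.

from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of eight themes on the 0-3 agreement scale.
rater1 = [3, 3, 2, 0, 1, 3, 2, 2]
rater2 = [3, 2, 2, 0, 1, 3, 1, 2]
print(round(cohens_kappa(rater1, rater2), 3))  # kappa for these ratings
```

Values near 1 indicate agreement well beyond chance; values near 0 indicate agreement no better than chance.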

Constant comparison analysis is a method of choice when the researcher wants to answer general, or overarching, questions of the data. For example, if a researcher wants to understand a school psychologist's perceptions of work, using constant comparison analysis would be appropriate. Word count is particularly useful if the researcher can assume that frequency of words, categories, or themes provides a good indication of meaningfulness. If this assumption cannot be made, then the school psychology researcher should consider using one of the other data analysis tools. KWIC is an appropriate tool to use when the researcher is interested in capturing the actual words used by the participant (i.e., in vivo codes). This is particularly useful when analyzing short responses to unstructured or structured questions. However, this analysis still can be utilized to analyze large unstructured text. Similarly, classical content analysis is useful when codes have been generated and a sense of how often each code has been used would be helpful. Generally, this is most useful when the researcher is interested in the frequency of the themes.

Domain analysis is helpful to use when the researcher has the option of interviewing the same participant again. With domain analysis, structural questions are generated that can help identify the aspects of the participant's story about which the researcher needs further information. Taxonomic analysis, like domain analysis, is helpful to increase understanding of participants' speech or language. It is best used after domain analysis has been completed, when further analysis would assist the researcher in organizing the included terms used by the participant(s). Finally, componential analysis is commonly used after domain and taxonomic analysis. This analysis is useful for seeing comparisons in the data and identifying places where the researcher needs further clarification from the participant(s).

All seven data analysis tools are similar in that they involve the three concurrent flows of activity: data reduction, data display, and conclusion drawing/verification (Miles & Huberman, 1994). However, some of these techniques (e.g., domain analysis) also can lead to data expansion by leading to additional questions being posed and addressed. Table 7 links general research question formats with suggested analyses. It should be noted that any type of analysis can be undertaken, regardless of the research questions; thus, these are only suggestions.


USING COMPUTER-ASSISTED QUALITATIVE DATAANALYSIS SOFTWARE

Computer-assisted qualitative data analysis software (CAQDAS) programs (e.g., NVIVO, version 7.0; QSR International Pty Ltd., 2006) are available that can be used to undertake each of the qualitative data analyses outlined above. Unfortunately, it is beyond the scope of the present article to provide specifics as to how computer qualitative data analysis software can be utilized. We recommend that, when possible, school psychology researchers take workshops that provide training in using computer qualitative data analysis software.4

For more than a decade, computer software for qualitative data has increased in availability and in complexity (Ulin, Robinson, & Tolley, 2005). Unfortunately, as noted by Bourdon (2002, ¶ 1), "their use is still relatively limited compared with their full potential. The proportion of research projects relying on them appears to be steadily increasing but very few researchers are known to fully exploit their capabilities." Yet, using computer-assisted qualitative data analysis software (CAQDAS) allows the researcher to take analysis of qualitative data significantly further than is possible undertaking this analysis by hand (Bazeley, 2006; Fielding & Lee, 1998; Kelle, 1996; Tesch, 1990; Weitzman & Miles, 1995). In particular, CAQDAS programs provide an excellent tool for recording, storing, indexing, and sorting the voluminous data that are the hallmark of many qualitative research studies (Morse & Richards, 2002). CAQDAS programs are extremely useful for allowing the researcher to code data, which involves disaggregating data by breaking them down into manageable components and identifying or naming these segments (i.e., assigning each to a category). The text material also can be coded easily into new emergent concepts, categories, or themes. Further, text stored in these new coding categories can be compared (i.e., cross-referenced) as a function of individual- or group-based demographic information (i.e., attributes) or with respect to responses to other coded questions (Bazeley, 2006).

4 The University of Surrey's CAQDAS Networking Project maintains a website with links to many software developers and workshops at http://www.soc.surrey.ac.uk/caqdas/. This organization has no commercial links to any software developer or supplier. For information about the full range of qualitative software available, and the tools that different software packages provide, we advise you to go directly to the software developers' Web sites. Also, for an excellent comparison of the leading qualitative software (i.e., ATLAS.ti V5, HyperRESEARCH V2.06, MAXqda V2, N6, NVivo 2, QDA Miner, Qualrus, Transana), see Lewins and Silver (2005).

Table 7. Types of Research Questions (Adapted From Miles and Huberman, 1994)

General research question format | Data analysis techniques (CCA, WC, KWIC, CC, DA, TA, CA)
Does X cause Y? | X X
Does X cause more of Y than Z causes of Y? | X X X
What is X? | X X X X X X X
Is X located where Y is lowest? | X X X X
What does "Y" mean? | X X X X
Why does S support X? | X X X X X X
What makes W good? | X X X X X X
Does T value X? | X X X X X
Is X more cost-effective than Z? | X X X X
How are U maximized and V minimized simultaneously? | X X

Note. CCA = Classical Content Analysis; WC = Word Count; KWIC = Keywords in Context; CC = Constant Comparison; DA = Domain Analysis; TA = Taxonomic Analysis; CA = Componential Analysis.

Many CAQDAS programs (e.g., NVIVO) can find individual words and search patterns of words, coding, or attributes. These search results then can be saved, allowing the researcher to build more questions on them (Morse & Richards, 2002), as is often undertaken in domain analysis, taxonomic analysis, and componential analysis. All CAQDAS programs allow the conduct of word count and KWIC by providing various types of counts, such as the number of cases with a particular coding, the number of text segments coded at a specific category, and the total number and/or proportion of characters or words coded, while retaining the text underlying the numeric information in order to provide situated meaning (Bazeley, 2006). Another advantage of CAQDAS programs is that because they (can) record all the major analytic decisions made by the researcher, an audit trail can be left. As noted by Dey (1993), "the ability to interrogate data and revise conceptualizations through searching and retrieving categorized data promises to introduce a new flexibility and rigour into qualitative analysis" (p. 59).

At this point, a caution is needed about using software: CAQDAS programs can help researchers to analyze their data, but they cannot analyze the data for researchers. Further, in using CAQDAS programs, flexibility, creativity, insight, and intuition should never be replaced by a systematic and mechanical analysis of qualitative data (Dey, 1993). The researcher is the main tool for analysis, regardless of whether a computer program is used to assist in the analysis (Denzin & Lincoln, 2005b). For large data sets, however, nothing matches computer programs when it comes to data management.


WHERE TO GO FROM HERE

As noted by Miles and Huberman (1994), "The strengths of qualitative data rest on the competence with which their analysis is carried out" (p. 10). We believe that an important step for school psychology researchers to become competent analysts of qualitative data is awareness of multiple ways of analyzing qualitative data. Thus, the purpose of this article was to provide elements to increase understanding of the multiple types of analyses available for school psychology researchers. Most researchers use one type of analysis and hope the results are trustworthy. The analysis of the example data throughout this paper exemplifies the need for researchers to use more than one data analysis method. For example, if the researcher had utilized word count alone, and focused on "different," the issue of advocating would not have emerged. Likewise, if the researcher stops with only one domain, as shown in the text, the concept of taking years to understand would not emerge.

Thus, we contend that researchers need to utilize at least two, if not more, types of data analysis tools in order to triangulate results. Lincoln and Guba (1985) discuss the importance of triangulation, which is a means of improving the rigor of the analysis by assessing the integrity of the inferences that one draws from more than one vantage point. Typically, triangulation involves the use of multiple data sources, multiple researchers, multiple theoretical perspectives, and/or multiple methods (Denzin, 1989; Schwandt, 2001). In particular, triangulation is viewed as involving the use of multiple data collection tools (Denzin, 1989; Oliver-Hoyo & Allen, 2006; Tuckett, 2005). However, we believe that the concept of triangulation also should be extended to data analysis tools. In fact, building on the conceptualizations of Onwuegbuzie and Teddlie (2003) and Greene, Caracelli, and Graham (1989), we contend that there are two major rationales for using multiple data analysis tools in qualitative research, namely: representation and legitimation. These two rationales and their components are represented in Figure 2.

Representation refers to the ability to extract adequate meaning from the underlying data. Using multiple qualitative data analyses allows the researcher to utilize the strengths of each qualitative data analysis tool involved in order to understand the phenomenon better. The ability to “get more out of the data” by using multiple analytical tools provides the school psychology researcher the opportunity to generate more meaning, thereby enhancing the quality of inferences. As can be seen from Figure 2, there are four major ways in which representation can be enhanced by using multiple qualitative data analysis techniques: within-method complementarity (i.e., seeking elaboration, enhancement, illustration, or clarification of the results from one analytical tool with results from another tool); within-method triangulation (i.e., seeking convergence and corroboration of results from different analytical methods used on the same data); within-method expansion (i.e., seeking to expand the breadth and range of inferences by using different analytical tools for different data components); and within-method development (i.e., using the results from one analytical approach to help inform results from another analytical approach). As such, using multiple qualitative data analysis tools can help researchers to address what Denzin and Lincoln (2005b) refer to as the crisis of representation, namely, the difficulty in capturing lived experiences via text.

Legitimation refers to the trustworthiness, credibility, dependability, confirmability, and/or transferability of the inferences made (Guba, 1981; Onwuegbuzie & Leech, 2007b). As noted by Onwuegbuzie and Leech (2004), lack of legitimation “means that the extent to which the data have been captured has not been adequately assessed, or that any such assessment has not provided support for legitimation” (p. 778). As can be seen from Figure 2, there are two major ways in which legitimation can be enhanced by using multiple qualitative data analysis techniques: within-method initiation (i.e., discovering paradoxes and contradictions that lead to a reframing of the research question) and within-method triangulation. As such, using multiple qualitative data analysis tools can help researchers to address what Denzin and Lincoln (2005b) refer to as the crisis of legitimation, namely, the difficulty in assessing qualitative findings.

Figure 2. Typology of reasons for using multiple qualitative data analysis tools.

As has been illustrated in this article, using multiple types of analyses makes the qualitative data analysis process more rigorous. Thus, researchers need to pay attention to the array of available qualitative analysis tools, learn about the tools, and use them diligently. In so doing, researchers will make qualitative research more appealing to the field of school psychology and beyond.

REFERENCES

Adler, L. (1996). Qualitative research of legal issues. In D. Schimmel (Ed.), Research that makes a difference: Complementary methods for examining legal issues in education, NOLPE Monograph Series (No. 56, pp. 3–31). Topeka, KS: NOLPE.

Banister, P., Burman, E., Parker, I., Taylor, M., & Tindal, C. (1994). Qualitative methods in psychology: A research guide. Buckingham: Open University Press.

Bazeley, P. (2006). The contribution of computer software to integrating qualitative and quantitative data and analyses. Research in the Schools, 13(1), 64–74.

Berg, B. L. (2004). Qualitative research methods for the social sciences (5th ed.). Boston: Pearson.

Bogdan, R. C., & Biklen, S. K. (2003). Qualitative research for education: An introduction to theories and methods (4th ed.). Boston: Pearson.

Bourdon, S. (2002, May). The integration of qualitative data analysis software in research strategies: Resistances and possibilities. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research [On-line Journal], 3(2). Retrieved March 11, 2006, from http://www.qualitative-research.net/fqs-texte/2-02/2-02bourdon-e.htm

Carley, K. (1993). Coding choices for textual analysis: A comparison of content analysis and map analysis. In P. Marsden (Ed.), Sociological methodology (pp. 75–126). Oxford: Blackwell.

Casagrande, J. B., & Hale, K. L. (1967). Semantic relationships in Papago folk-definitions. In D. Hymes & W. E. Bittle (Eds.), Studies in Southwestern ethnolinguistics (pp. 165–196). The Hague: Mouton.

Chambers, E. (2000). Applied ethnography. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 851–869). Thousand Oaks, CA: Sage.

Charmaz, K. (2000). Grounded theory: Objectivist and constructivist methods. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 509–535). Thousand Oaks, CA: Sage.

Coffey, A., & Atkinson, P. (1996). Making sense of qualitative data: Complementary research strategies. Thousand Oaks, CA: Sage.

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37–46.

Collins, K. M. T., Onwuegbuzie, A. J., & Sutton, I. L. (2006). A model incorporating the rationale and purpose for conducting mixed methods research in special education and beyond. Learning Disabilities: A Contemporary Journal, 4, 67–100.

Connolly, P. (1998). ‘Dancing to the wrong tune’: Ethnography, generalization, and research on racism in schools. In P. Connolly & B. Troyna (Eds.), Researching racism in education: Politics, theory, and practice (pp. 122–139). Buckingham, UK: Open University Press.

Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among the five approaches (2nd ed.). Thousand Oaks, CA: Sage.

Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods. New York: Praeger.

Denzin, N. K. (1989). The research act: Theoretical introduction to sociological methods (3rd ed.). Englewood Cliffs, NJ: Prentice Hall.

Denzin, N. K., & Lincoln, Y. S. (Eds.). (2005a). The Sage handbook of qualitative research (3rd ed.). Thousand Oaks, CA: Sage.

Denzin, N. K., & Lincoln, Y. S. (2005b). Introduction: The discipline and practice of qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 1–32). Thousand Oaks, CA: Sage.

Dey, I. (1993). Qualitative data analysis: A user-friendly guide for social scientists. London: Routledge.

Eisner, E. W. (1998). The enlightened eye: Qualitative inquiry and the enhancement of educational practice. Upper Saddle River, NJ: Merrill Prentice Hall.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1995). Writing ethnographic fieldnotes. Chicago: The University of Chicago Press.

Fielding, N. G., & Lee, R. M. (1998). Computer analysis and qualitative research. Thousand Oaks, CA: Sage.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.

Glesne, C., & Peshkin, A. (1992). Becoming qualitative researchers: An introduction. White Plains, NY: Longman.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.

Gresham, F. M., MacMillan, D. L., Beebe-Frankenberger, M. E., & Bocian, K. M. (2000). Treatment integrity in learning disabilities research: Do we really know how treatments are implemented? Learning Disabilities Research & Practice, 15, 198–205.

Guba, E. G. (1981). ERIC/ECTJ annual review paper: Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology: A Journal of Theory, Research, and Development, 29, 75–91.

Halpern, E. S. (1983). Auditing naturalistic inquiries: The development and application of a model. Unpublished doctoral dissertation, Indiana University, Bloomington.

Hatch, J. A. (2002). Doing qualitative research in education settings. Albany: SUNY Press.

Hess, R., Molina, A., & Kozleski, E. B. (2006). Until somebody hears me: Parental voice and advocacy in special education decision-making. British Journal of Special Education, 33, 148–157.

Janesick, V. J. (2000). The choreography of qualitative research design: Minuets, improvisations, and crystallization. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed.). Thousand Oaks, CA: Sage.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33, 14–26.

Kelle, U. (Ed.). (1996). Computer-aided qualitative data analysis. Thousand Oaks, CA: Sage.

Kratochwill, T. R., & Stoiber, K. C. (2000). Uncovering critical research agendas for school psychology. School Psychology Review, 29, 591–603.

Leech, N. L. (2004). Types of qualitative and quantitative analyses used by faculty researchers. Unpublished manuscript, University of Colorado at Denver and Health Sciences Center.

Leech, N. L., & Goodwin, L. (2004). Research courses in Ph.D. programs across the United States: Accountability and trends. Unpublished manuscript, University of Colorado at Denver and Health Sciences Center.

Leech, N. L., & Onwuegbuzie, A. J. (2005, April). Qualitative data analysis: Ways to improve accountability in qualitative research. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.

Lewins, A., & Silver, C. (2005). Choosing a CAQDAS package: A working paper. Retrieved March 11, 2006, from http://caqdas.soc.surrey.ac.uk/ChoosingLewins&SilverV3Nov05.pdf

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Maxwell, J. A. (1992). Understanding and validity in qualitative research. Harvard Educational Review, 62, 279–299.

Maxwell, J. A. (2005). Qualitative research design: An interactive approach (2nd ed.). Newbury Park, CA: Sage.

Merriam, S. B. (1998). Qualitative research and case study applications in education (2nd ed.). San Francisco: Jossey-Bass.

Merriam-Webster. (2005). Merriam-Webster Online Dictionary. Retrieved January 25, 2005, from http://www.mw.com/cgibin/dictionary?book=Dictionary&va=symbol&x=10&y=7

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.

Morse, J. M., & Richards, L. (2002). Read me first for a user’s guide to qualitative methods. Thousand Oaks, CA: Sage.

Nastasi, B. K., & Schensul, S. L. (Eds.). (2005). Contributions of qualitative research to the validity of intervention research. Special issue of Journal of School Psychology, 43, 177–195.

Oliver-Hoyo, M., & Allen, D. (2006). The use of triangulation methods in qualitative educational research. Journal of College Science Teaching, 35, 42–47.

Onwuegbuzie, A. J. (2003). Expanding the framework of internal and external validity in quantitative research. Research in the Schools, 10, 71–90.

Onwuegbuzie, A. J., & Leech, N. L. (2004). Enhancing the interpretation of “significant” findings: The role of mixed methods research. The Qualitative Report, 9, 770–792. Retrieved March 8, 2005, from http://www.nova.edu/ssss/QR/QR9-4/onwuegbuzie.pdf

Onwuegbuzie, A. J., & Leech, N. L. (2006). Linking research questions to mixed methods data analysis procedures. The Qualitative Report, 11, 474–498. Retrieved June 6, 2007, from http://www.nova.edu/sss/QR/QR11-3/onwuegbuzie.pdf

Onwuegbuzie, A. J., & Leech, N. L. (2007a). A call for qualitative power analyses. Quality & Quantity: International Journal of Methodology, 41, 105–121.

Onwuegbuzie, A. J., & Leech, N. L. (2007b). Validity and qualitative research: An oxymoron? Quality & Quantity: International Journal of Methodology, 41, 233–249.

Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351–383). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J., & Wilson, V. A. (2003). Statistics anxiety: Nature, etiology, antecedents, effects, and treatments: A comprehensive review of the literature. Teaching in Higher Education, 8, 195–209.

Pennebaker, J. W., Mehl, M. R., & Niederhoffer, K. G. (2003). Psychological aspects of natural language use: Our words, our selves. Annual Review of Psychology, 54, 547–577.

Phillips, L. J., & Jorgensen, M. W. (2002). Discourse analysis as theory and method. Thousand Oaks, CA: Sage.

Powell, H., Mihalas, S., Onwuegbuzie, A. J., Suldo, S., & Daley, C. E. (in press). Mixed methods research in school psychology: A mixed methods investigation of trends in the literature. Psychology in the Schools.

QSR International Pty Ltd. (2006). NVIVO: Version 7. Reference guide. Doncaster, Victoria, Australia: Author.

Richardson, L., & St. Pierre, E. A. (2005). Writing: A method of inquiry. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 959–978). Thousand Oaks, CA: Sage.

Ryan, G. W., & Bernard, H. R. (2000). Data management and analysis methods. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 769–802). Thousand Oaks, CA: Sage.

Sandelowski, M. (2001). Real qualitative researchers don’t count: The use of numbers in qualitative research. Research in Nursing & Health, 24, 230–240.

Schram, T. H. (2003). Conceptualizing qualitative inquiry: Mindwork for fieldwork in education and the social sciences. Upper Saddle River, NJ: Pearson.


Schwandt, T. A. (2001). Dictionary of qualitative inquiry (2nd ed.). Thousand Oaks, CA: Sage.

Sechrest, L., & Sidana, S. (1995). Quantitative and qualitative methods: Is there an alternative? Evaluation and Program Planning, 18, 77–87.

Shank, G. D. (2002). Qualitative research: A personal skills approach. Upper Saddle River, NJ: Merrill Prentice Hall.

Siegel, S., & Castellan, J. N. (1988). Nonparametric statistics for the behavioural sciences. New York: McGraw-Hill.

Silverman, D. (2001). Interpreting qualitative data. London: Sage.

Spradley, J. P. (1979). The ethnographic interview. Fort Worth, TX: Holt, Rinehart and Winston.

Stake, R. E. (2005). Qualitative case studies. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 443–466). Thousand Oaks, CA: Sage.

Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Tesch, R. (1990). Qualitative research: Analysis types and software tools. London: Falmer.

Todd, Z., Nerlich, B., & McKeown, S. (2004). Introduction. In Z. Todd, B. Nerlich, S. McKeown, & D. D. Clarke (Eds.), Mixing methods in psychology: The integration of qualitative and quantitative methods in theory and practice (pp. 3–16). Hove, East Sussex: Psychology Press.

Tuckett, A. G. (2005). Rigour in qualitative research: Complexities and solutions. Nurse Researcher, 13, 29–42.

Ulin, P. R., Robinson, E. T., & Tolley, E. E. (2005). Qualitative methods in public health: A field guide for applied research. San Francisco: Jossey-Bass.

Weitzman, E., & Miles, M. B. (1995). Computer programs for qualitative data analysis. Thousand Oaks, CA: Sage.

Wolcott, H. F. (2001). Writing up qualitative research (2nd ed.). Thousand Oaks, CA: Sage.
