Volume 6, No. 3, Art. 15 – September 2005
Review:
Marilyn Lichtman
Janet Heaton (2004). Reworking Qualitative Data. London: Sage, 160 pages, Cloth (ISBN 0-7619-7142-4) $115 / Paper (ISBN 0-7619-7143-2) $36.95
Abstract: For more than a quarter of a century, researchers have systematically analyzed quantitative data from secondary data sets. Many of these data sets are widely available, and researchers are encouraged to explore new questions. This widely accepted practice is being expanded to include qualitative data sets and, in this book, HEATON provides a fairly extensive review of such efforts, especially in the United Kingdom. She identifies some sixty-five studies and uses them to explore issues of procedure, ethics and epistemology. Further, she puts forth the idea that conducting qualitative secondary analyses might become a new methodology. One interesting idea she presents is that secondary analyses differ in character, varying by function, focus, type of data and source of data. At just over 100 pages, this book will provide you with some new insights into the practice of secondary analysis. I was left, however, wanting more than HEATON provided.
Key words: secondary analysis, qualitative analysis, qualitative analysis as methodology
Table of Contents
1. Introduction
2. Issues Regarding Secondary Analysis
3. Making Her Case
4. Additional Speculations
No doubt most of you are familiar with secondary analysis of quantitative data. Researchers have routinely explored large data sets, asking new questions and using extant data to seek answers. But this practice has not been extended to qualitative data to any great extent, at least in the United States. This is not the case in the UK and parts of Europe, however. Before you begin reading HEATON's book on qualitative data, you might want to review some of the other writing on the topic. Much of what has been published has been explored in several issues of FQS. See, for example, the discussions by CORTI (2000, 2005) in FQS 1(3) and FQS 6(1) and, most recently, a series of contributions on research, archiving and reuse of qualitative data in FQS 6(2). [1]
CORTI's 2000 article provides a comprehensive overview of the issues and problems related to preserving and archiving qualitative data. Her focus is on a large-scale effort in the UK under the aegis of the Economic and Social Research Council (ESRC) Qualitative Data Archival Resource Centre. She addresses many of the issues surrounding the establishment of such archives and the strong feelings of researchers both for and against the idea. She considers such key issues as priorities for acquisition, funding and access. I don't want to repeat the entire contents of FQS 1(3); suffice it to say that you should read it as background for examining HEATON's book. Some five years later, in FQS 6(1), CORTI (2005), in her role as organizer of a conference on secondary analysis of qualitative data, introduces a set of papers on preserving the context of the research and on potential problems with decontextualizing archived data. You might also find the case studies interesting. [2]
Most recently, in FQS 6(2), BERGMAN and EBERLE served as editors of a series of articles based on a workshop held in Switzerland with contributors from major European institutions. Many of the articles address issues of archiving and acquisition of qualitative data. As I write this review, I think about my experience in the United States. I do not believe that there is any movement similar to the one in the UK or elsewhere in Europe. Perhaps one is needed, but, at this writing, I have not seen it. [3]
HEATON explores the idea that secondary data analysis can be extended from the quantitative domain to the qualitative arena. Further, she argues that such analysis should be considered a methodology of qualitative inquiry. While HEATON acknowledges that doing secondary analysis is not new, she suggests that advances in archiving data and in computing have led to an increased interest in the United States and the United Kingdom. Being located at the University of York, she draws her illustrations from efforts in the UK to archive and make available data sets, especially in health and social care, but she acknowledges that secondary analyses are also taking place in such areas as education and criminology. You should be aware that HEATON's discussion is based on ESDS Qualidata, part of the Economic and Social Data Service. This effort represents one of the largest services available supporting the acquisition, dissemination and re-use of qualitative data. Supported by the Universities of Essex and Manchester, it serves as a repository of various types of qualitative data. [4]
I must say that I was very interested in the areas HEATON chose to cover. After defining secondary analysis in general and comparing qualitative secondary analysis with the more familiar quantitative secondary analysis, she addresses epistemological, legal and ethical issues. Practicalities of how to do such an analysis are also covered. She concludes with speculations about the future. Her primary objective, I believe, is to legitimize qualitative secondary analysis and to elevate it to the status of a methodology. [5]
HEATON's book is written in seven chapters. Much of the information she covers is based on her review of some sixty-five studies emanating from the UK. Chapter 1 addresses the general topic of secondary analysis of data. She covers two main topics. First, she compares quantitative secondary analysis with qualitative secondary analysis. She also introduces the idea that qualitative secondary analysis should be considered a methodology of qualitative research and not just a technique. I am not particularly persuaded, however, by the idea that this should be seen as "a methodology for investigating new research questions" (p.16). In fact, one thing that I think is missing from the book is a discussion of the details of how such analyses have been done or could be done. [6]
A history of secondary analysis is provided in chapter 2, along with a discussion of the pros and cons of such analyses. Chapter 3 identifies five types of secondary analysis: supra analysis, supplementary analysis, re-analysis, amplified analysis and assorted analysis. Supra analysis examines new questions, whether empirical, theoretical or methodological. Supplementary analysis involves a more in-depth investigation that goes beyond the original questions. Re-analysis is used for verification or corroboration. Amplified analysis combines data from several studies as a way to enlarge the sample. Finally, assorted analysis combines primary and secondary analysis. I suspect few of you are aware of these different types. She concludes this chapter by suggesting that the methodology is not as well established as it is in quantitative secondary analysis. I couldn't agree with her more. I was somewhat disappointed that these five categories are not developed further as she continues with her assessment of available studies. [7]
She raises several epistemological problems in chapter 4. She speaks about tensions between qualitative and quantitative research and raises issues about "data fit," "not being there" and verification. Throughout her book, she compares quantitative and qualitative secondary analysis practices. For example, while she sees "data fit" as a potential problem with either type, she suggests the problem becomes more pronounced with qualitative data because of their flexible nature. She raises the issue of "seeing through the eyes of others," a tenet of qualitative research, but she does not feel that it presents a particular problem. [8]
She addresses ethical and legal issues in chapter 5. For example, she considers issues of informed consent, confidentiality, copyright and data protection. While HEATON acknowledges concerns about such issues, in her review of studies she points out that the issues were not really addressed. In chapter 6 she provides practical suggestions on re-using qualitative data, and addresses the future in her final chapter. I address her suggestions in a later section of this review. [9]
HEATON acknowledges that secondary analysis using statistical data has been practiced throughout the twentieth century, but that the first text was not published until the early seventies (HYMAN 1972). I believe it is the Internet and the computer that facilitate such qualitative secondary analysis. Our experience with large-scale secondary analysis of quantitative data in the United States began in the 1970s. The widespread availability of large data sets from the National Center for Education Statistics (http://nces.ed.gov/surveys/hsb/), especially the national longitudinal studies of high school students begun in 1972 with follow-up studies in 1980 and 1988, provided an enormous resource for secondary data analysis by researchers and graduate students. Data were made available on tape and disc, and hundreds of papers were published using statistical techniques such as multiple regression and factor analysis to investigate school achievement and related factors. The technology available in the 21st century has dramatically changed the way such analyses are accessed and conducted. You might find it interesting to look at an interactive textbook in the field of health research. Supported by the National Institutes of Health in the United States, BIERMAN and BUBOLZ (no date) provide excellent examples of issues related to such analyses. [10]
It was not until the mid-1990s that researchers began to consider issues relating to secondary analysis of qualitative data (CORTI 2000; CORTI, DAY & BACKHOUSE 2000; FINK 2000; HAMMERSLEY 1997). I found HEATON's discussion of the types of qualitative data interesting. She distinguishes between data gathered for research studies, which she calls non-naturalistic or "artefactual" data (e.g. field notes, interviews, or focus groups), and naturalistic data (e.g. diaries, letters, or photographs). HEATON's focus is on data that come from research studies and are used for "the purposes of investigating new questions or verifying previous studies" (p.16). I suspect that she is less inclined to use such analysis for verification than for the examination of new questions. HEATON argues that such analysis should be considered a methodology rather than just a further data analysis task. As I say later on, I am not convinced. [11]
Until recently, qualitative data were not available to researchers who wanted to investigate further, using new questions. Researchers seemed almost proprietary about their data, and students were told not to permit their data to go beyond their file cabinets. Issues of confidentiality and quality, as well as the more practical issue of how to get the data "out there," most certainly influenced this practice. HEATON points to other factors behind the increase: the practice and policy of data sharing and data retention, as well as the views of academics about potential problems and benefits. [12]
2. Issues Regarding Secondary Analysis
Many writers like to place things in categories. I suspect it is a way of organizing their own thoughts and trying to make sense of them. Sometimes these categories exist before looking at the things to be sorted, but often they emerge when examining the items. The practice is certainly not limited to research or to secondary analysis. Here is one example from the field of photography: BARRETT (2000) asks us to organize photographs into such categories as descriptive, interpretive, evaluative, aesthetic and theoretical. Sometimes a photograph fits one category neatly; at other times the categories overlap and a photo can be sorted into more than one group. I am also intrigued by the parallel between these categories and types of qualitative research. [13]
HEATON also identifies categories into which qualitative secondary analysis work might fit. She conducts a meta-analysis of some sixty-five qualitative studies, most of which were published after 1990. Like BARRETT, she chose to sort the various studies into categories. My guess is that these categories emerged from the data as she began to think about the studies she used. She uses the following categories: supra analysis (an examination of new questions that transcend the first study), supplementary analysis (in-depth investigation of an emerging issue), re-analysis (a re-investigation of the primary question), amplified analysis (combining data from two or more primary studies), and assorted analysis (using both primary and secondary data sets). She concludes that the methodology is not as well established as in quantitative secondary analysis. I thought these categories were somewhat forced and not especially helpful. I like to think of categories as fluid and flexible, rather like qualitative research, perhaps akin to the photography analogy. As new techniques emerge, categories would be modified to fit. [14]
One of the important contributions of HEATON's work is her detailed explanation of the practicalities of doing such analyses. In her chapter called Modi Operandi, she identifies six areas for consideration: design, selection of data set(s), analysis, quality assurance, reportage and reflexivity. Her stated intention is to "critically describe and review existing practice" (p.89). She raises issues related to locating data sets and determining their suitability, accessibility and quality. In terms of analysis, she emphasizes how adaptations can be made, especially when doing grounded theory. Triangulation, member checks and audit trails are also addressed briefly. When I finished reading this chapter, however, I couldn't say that I had come away with many more practical suggestions. While she addresses some issues, I think she could go into greater depth than she does. [15]
In the chapter on epistemological issues, HEATON suggests that researchers often reshape the data or add new data. In fact, she suggests that very few of the secondary analyses were based solely on reusing data collected by others. Her review of extant studies suggests that those who conduct secondary analyses adopt practical and innovative ways to use pre-existing data. While she raises various ethical and legal issues, she believes that informed consent, confidentiality and copyright agreements need further exploration. [16]
HEATON concludes with statements about the future. She acknowledges that "secondary analysis does have the makings of a qualitative methodology" (p.124), but that various epistemological, ethical and methodological issues are yet to be resolved. I do not find sufficient examples or evidence that secondary analysis of qualitative data will become a separate methodology, although I certainly see how archiving and data services provide greater access. [17]
I believe HEATON writes on a topic that has had little attention. She introduces the reader to some novel ideas. Is there more than one type of secondary analysis? How does qualitative secondary analysis compare with quantitative secondary analysis? Can the act of analysis be considered a legitimate new methodology? What is currently out there in terms of analyses? I do not recall reading about these topics in other work and she whets our appetite for more. [18]
What her data do show is that secondary analyses are performed either by the primary researcher alone or in conjunction with others. She does not really get into why these analyses were performed, other than to say that new questions emerged. I believe it is also possible that these researchers had access to the data and thought they could do more with them. I know that in the quantitative field, researchers often begin with the data and then look for questions. Whether they would acknowledge that they followed this direction, however, is another matter. [19]
What makes this a new methodology and not just further analysis of data? I would like to see her address this issue in greater detail since I am not quite convinced by her argument. For example, audit trails and triangulation are issues more closely aligned with those who look for structure in qualitative research. She says that qualitative secondary analysis is used most often by North American researchers using grounded theory. I think her personal view is reflected at the end of her work, where she hopes that qualitative secondary analysts might develop studies informed by newer, post-modern perspectives in which more flexible and innovative styles of research are used. While I might agree that this is what the future should hold, her analysis of what is currently out there does not really go in this direction. [20]
I would have liked to see some examples of re-analyses and comparisons between original and secondary analyses. I doubt that the actual writing looks different. I am also not sure that it is helpful to try to draw comparisons between quantitative and qualitative secondary analysis. In doing so, I think she tries to make the two processes parallel; I wonder whether they are, or need to be. [21]
I would like to see her go much further into how the computer and the Internet can serve as archives for data sets to be used by researchers around the world. Awareness, storage, access and retrieval are critical. Unlike large quantitative data sets, most qualitative data are collected at a local and personal level. They do not exist in data files in a systematic manner. How would a researcher learn about a particular data set unless he or she were involved with it? How would access be granted? I think these questions are critical as we move into a new information age. [22]
Qualitative data analysis software is mentioned only in passing in her book. If researchers were to access data and conduct a reanalysis, what use, if any, would they make of such software? And what about the practical problems of data formats, various word processing packages, and even the potential incompatibility between PC and Mac computers? [23]
When I completed this book, I asked myself this question: Is there really a movement in the field for researchers to analyze existing data sets other than those for which they have had major responsibility? I do not believe so, but it might happen in the future. Most of the researchers in HEATON's review had the close familiarity with the data that comes from having been involved in collecting them initially. If HEATON were to do her analysis in 2005 and beyond, would she find a groundswell of support for this kind of secondary analysis by others? I think that question remains to be answered. [24]
In this day of the information age and instant access, I believe the possibilities exist for this potentially new way of thinking about data. I hope that HEATON continues to be motivated to explore this new field. [25]
Barrett, Terry (2000). Criticizing Photographs: An Introduction to Understanding Images (3rd edition). New York: McGraw Hill.
Bierman, Arlene & Bubolz, Thomas (no date). Chapter 20. Secondary Analysis of Large Survey Databases. In Mitchell Max & Joanne Lynne (Eds.), Interactive Textbook on Symptom Research: Methods and Opportunities. Available at: http://painconsortium.nih.gov/symptomresearch/chapter_20/index.htm [Date of access: June 30, 2005].
Corti, Louise (2000, December). Progress and Problems of Preserving and Providing Access to Qualitative Data for Social Research—The International Picture of an Emerging Culture [58 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [Online Journal], 1(3), Art. 2. Available at: http://www.qualitative-research.net/fqs-texte/3-00/3-00corti-e.htm [Date of access: June 30, 2005].
Corti, Louise; Day, Annette & Backhouse, Gill (2000, December). Confidentiality and Informed Consent: Issues for consideration in the preservation of and provision of access to qualitative data archives [46 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [Online Journal], 1(3), Art. 7. Available at: http://www.qualitative-research.net/fqs-texte/3-00/3-00cortietal-e.htm [Date of access: June 30, 2005].
ESDS Qualidata Online. http://www.esds.ac.uk/qualidata/online/ [Date of access: June 30, 2005].
Fink, Anne Sofie (2000, December). The Role of the Researcher in the Qualitative Research Process. A Potential Barrier to Archiving Qualitative Data [69 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research [Online Journal], 1(3), Art. 4. Available at: http://www.qualitative-research.net/fqs-texte/3-00/3-00fink-e.htm [Date of access: June 30, 2005].
Hammersley, Martyn (1997). Qualitative Data Archiving: Some Reflections on its Prospects and Problems. Sociology, 31(1), 131-142.
Hyman, Herbert (1972). Secondary Analysis of Sample Surveys: Principles, Procedures, and Potentialities. New York: John Wiley.
National Center for Education Statistics. http://nces.ed.gov/surveys/hsb/ [Date of access: June 30, 2005].
Marilyn LICHTMAN retired as a professor of educational research and evaluation from Virginia Tech, Falls Church and Blacksburg, Virginia. She has taught qualitative research methods for more than a dozen years. Her book on qualitative research in education is forthcoming from Sage Publications. Her research interests involve alternative methods of teaching qualitative research. She serves on the editorial boards of FQS and The Qualitative Report. She is completing her 12th year as a docent at the Corcoran Gallery of Art in Washington, DC, one of the oldest private art museums in the United States, where she conducts tours for children and adults. In previous FQS issues Marilyn LICHTMAN reviewed The NVivo Qualitative Project Book (by BAZELEY & RICHARDS 2000) and Visual Methodologies (by ROSE 2001).
Contact:
Marilyn Lichtman
Educational Research and Evaluation Program Area, College of Education
Virginia Tech
5809 Nicholson Lane, #511, Rockville, MD 20852, USA
E-mail: mlichtma@vt.edu
Lichtman, Marilyn (2005). Review: Janet Heaton (2004). Reworking qualitative data [25 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 6(3), Art. 15, http://nbn-resolving.de/urn:nbn:de:0114-fqs0503150.