Volume 6, No. 2, Art. 27 – May 2005
"Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered
Udo Kelle
Abstract: Since the late 1960s Barney GLASER and Anselm STRAUSS, the developers of the methodology of "Grounded Theory", have made several attempts to explicate, clarify and reconceptualise some of the basic tenets of their methodological approach. Diverging concepts and understandings of Grounded Theory have arisen from these attempts, and they have led to a split between the two founders.
Much of the explication and reworking of Grounded Theory concerns the relation between data and theory and the role of previous theoretical assumptions. The book which initially established the popularity of GLASER's and STRAUSS' methodological ideas, "The Discovery of Grounded Theory" (1967), contains two conflicting understandings of the relation between data and theory: the concept of "emergence" on the one hand and the concept of "theoretical sensitivity" on the other. Much of the later development of Grounded Theory can be seen as a series of attempts to reconcile these prima facie diverging concepts: GLASER recommends drawing on a variety of "coding families", while STRAUSS proposes the use of a general theory of action to provide an axis for an emerging theory.
This paper first summarises the most important developments within Grounded Theory concerning the understanding of the relation between empirical data and theoretical statements. Special emphasis is placed on the differences between GLASER's and STRAUSS' concepts and on GLASER's current critique that the concepts of "coding paradigm" and "axial coding" described by STRAUSS and Juliet CORBIN lead to the "forcing" of data. It will be argued that GLASER's critique points out some existing weaknesses of STRAUSS' concepts but vastly exaggerates the risks of the STRAUSSian approach. A main argument of this paper is that basic problems of empirically grounded theory construction can be treated much more effectively if one draws on certain results of contemporary philosophical and epistemological discussions and on widely accepted concepts developed in such debates. This refers especially to the critique of naive empiricism, to the concept of hypothetical or abductive inference, to the concept of the empirical content or falsifiability of statements, and to the concept of corroboration.
Key words: grounded theory, induction, abduction, theoretical sensitivity, coding paradigm, theory building
Table of Contents
1. How Do Categories "Emerge" From the Data? "Theory-ladenness" of Observations as a Problem for Grounded Theory Methodology
2. Different Approaches in Grounded Theory to Solve the Problem
2.1 GLASER's approach: theoretical coding with the help of "coding families"
2.2 STRAUSS' and CORBIN's approach: axial coding and the coding paradigm
3. The Split Between GLASER and STRAUSS in the 1990s
4. Towards a Clearer Understanding of the "Grounding" of Categories and Theories
4.1 Abductive inference as a logical foundation of theory building
4.2 Empirical content or falsifiability as a criterion for the applicability of theoretical preconceptions in qualitative inquiry
4.3 The necessity of corroboration of empirically grounded categories and hypotheses
5. Concluding Remarks
1. How Do Categories "Emerge" From the Data? "Theory-ladenness" of Observations as a Problem for Grounded Theory Methodology
Can the claim to discover theoretical categories and propositions in empirical data be reconciled with the fact that researchers always have to draw on already existing theoretical concepts when analysing their data? In the three decades following the publication of GLASER's and STRAUSS' famous methodological monograph "The Discovery of Grounded Theory" (1967), both authors made several attempts to render these two conflicting methodological requirements compatible. [1]
One of the main purposes of GLASER's and STRAUSS' "Discovery book" was to challenge the hypothetico-deductive approach, which demands the development of precise and clear-cut theories or hypotheses before data collection takes place. GLASER and STRAUSS criticised the "overemphasis in current sociology on the verification of theory, and a resultant de-emphasis on the prior step of discovering what concepts and hypotheses are relevant for the area that one wishes to research" (GLASER & STRAUSS 1967, pp.1f.) and bemoaned "that many of our teachers converted departments of sociology into mere repositories of 'great-man' theories" (ibid., p.10), leading to an antagonism between "theoretical capitalists" and a mass of "proletariat testers" (p.11). The Discovery book was thus an attempt to strengthen the cause of the researchers and doctoral students who formed this scientific proletariat:
"(...) we are also trying, through this book, to strengthen the mandate for generating theory, to help provide a defense against doctrinaire approaches to verification (...). It should also help students to defend themselves against verifiers who would teach them to deny the validity of their own scientific intelligence" (p.7). [2]
As an alternative to the hypothetico-deductive approach in social research, GLASER and STRAUSS proposed a "general method of comparative analysis" which would allow for the "emergence" of categories from the data: "We suggest as the best approach an initial, systematic discovery of the theory from the data of social research. Then one can be relatively sure that the theory will fit and work" (p.3). Following the Discovery book, a crucial measure against forcing the data into a Procrustean bed would be "literally to ignore the literature of theory and fact on the area under study, in order to assure that the emergence of categories will not be contaminated ..." (p.37). [3]
Ironically, such a stance represents one of the roots of positivist epistemology. In the early days of the modern natural sciences, in the 17th and 18th centuries, early empiricist philosophers like Francis BACON or John LOCKE were convinced that the only legitimate theories were those which could be derived inductively, by simple generalisation, from observable data. Following BACON, one of the most important tasks of an empirical researcher was to free his or her mind from any theoretical preconceptions and "idols" before approaching empirical data. However, since Immanuel KANT's sophisticated critique of the pitfalls of early empiricism (nowadays often called "naïve empiricism" or "naïve inductivism", cf. CHALMERS 1999), this epistemological position has lost most of its proponents; even most followers of "Logical Positivism" in the 1930s did not adhere to it. The idea that researchers can approach reality "as it is" if only they free their minds from all preconceived ideas has deservedly fallen into disrepute in contemporary epistemology.
"Both historical examples and recent philosophical analysis have made it clear that the world is always perceived through the 'lenses' of some conceptual network or other and that such networks and the languages in which they are embedded may, for all we know, provide an ineliminable 'tint' to what we perceive" (LAUDAN 1977, p.15). [4]
It is impossible to free empirical observation from all theoretical influence, since "(...) seeing is a 'theory-laden' undertaking. Observation of x is shaped by prior knowledge of x" (HANSON 1965, p.19). Since the 1960s it has been one of the most crucial and widely accepted insights of epistemology and cognitive psychology that "there are and can be no sensations unimpregnated by expectations" (LAKATOS 1978, p.15) and that the construction of any theory, whether empirically grounded or not, cannot start ab ovo but has to draw on already existing stocks of knowledge. At the same time this philosophical critique of inductivism and the emphasis on the "theory-ladenness" of observation also highlight the role of previous knowledge in hermeneutic Verstehen (KELLE 1995, p.38): qualitative researchers who investigate a different form of social life always bring with them their own lenses and conceptual networks. They cannot drop them, for in that case they would no longer be able to perceive, observe and describe meaningful events; confronted with chaotic, meaningless and fragmented phenomena, they would have to give up their scientific endeavour. [5]
The infeasibility of an inductivist research strategy which demands an empty head (instead of an "open mind") can not only be shown by epistemological arguments, it can also be seen in research practice. Especially novices in qualitative research with a strong desire to adhere to what they see as a basic principle and hallmark of Grounded Theory, the "emergence" of categories from the data, often experience a particular difficulty: in open coding, the search for adequate coding categories can become extremely tedious and the subject of numerous, seemingly endless team sessions, especially if one hesitates to introduce theoretical knowledge explicitly. The declared purpose of letting codes emerge from the data then leads to an enduring proliferation of coding categories which makes the whole process unmanageable. In a methodological self-reflection, a group of junior researchers who had asked me for methodological advice described this proliferation of code categories as follows:
"Especially the application of an open coding strategy recommended by Glaser and Strauss—the text is read line by line and coded ad hoc—proved to be unexpectedly awkward and time consuming. That was related to the fact that we were doing our utmost to pay attention to the respondents' perspectives. In any case we wanted to avoid the overlooking of important aspects which may lay behind apparently irrelevant information. Our attempts to analyze the data were governed by the idea that we should address the text tabula rasa and by the fear to structure data to much on the basis of our previous knowledge. Consequently every word in the data was credited with high significance. These uncertainties were not eased by advice from the corresponding literature that open coding means a 'preliminary breaking down of data' and that the emerging concepts will prove their usefulness in the ongoing analysis. Furthermore, in the beginning we had the understanding that 'everything counts' and 'everything is important'—every yet marginal incident and phenomenon was coded, recorded in numerous memos and extensively discussed. This led to an unsurmountable mass of data ..." (cf. KELLE, MARX, PENGEL, UHLHORN & WITT, 2002, translation by UK). [6]
A more thorough look at the Discovery book reveals that GLASER and STRAUSS were aware of this problem, for they wrote: "Of course, the researcher does not approach reality as a tabula rasa. He must have a perspective that will help him see relevant data and abstract significant categories from his scrutiny of the data" (GLASER & STRAUSS 1967, p.3). [7]
GLASER and STRAUSS coined the term "theoretical sensitivity" to denote the researcher's ability to "see relevant data", that is, to reflect upon empirical data material with the help of theoretical terms. "Sources of theoretical sensitivity build up in the sociologist an armamentarium of categories and hypotheses on substantive and formal levels. This theory that exists within a sociologist can be used in generating his specific theory (...)" (ibid., p.46). But how can a researcher acquire such an armamentarium of categories and hypotheses? The Discovery book contains only a brief hint, referring to the "great man" theorists, who "(...) have indeed given us models and guidelines for generating theory, so that with recent advances in data collection, conceptual systematization and analytic procedures, many of us can follow in their paths" (p.11). One may find this remark surprising given the sharp criticism of "theoretical capitalists" launched elsewhere in the book. Furthermore, the authors write that an empirically grounded theory combines concepts and hypotheses which have emerged from the data with "some existing ones that are clearly useful" (p.46). However, the Discovery book offers no clear advice on how this combination can be achieved. [8]
Consequently, in the earliest version of Grounded Theory the advice to employ theoretical sensitivity to identify theoretically relevant phenomena coexists with the idea that theoretical concepts "emerge" from the data if researchers approach the empirical field with no preconceived theories or hypotheses. These two ideas, which have conflicting implications, are not integrated with each other in the Discovery book. Furthermore, the concept of theoretical sensitivity is not converted into clear-cut methodological rules: it remains unclear how a theoretically sensitive researcher can use previous theoretical knowledge to avoid drowning in the data. If one takes into account the frequent warnings not to force theoretical concepts on the data, one gets the impression that a grounded theorist is advised to introduce suitable theoretical concepts ad hoc, drawing on implicit theoretical knowledge, but should abstain from approaching the empirical data with ex ante formulated hypotheses. [9]
2. Different Approaches in Grounded Theory to Solve the Problem
2.1 GLASER's approach: theoretical coding with the help of "coding families"
Much of GLASER's and STRAUSS' later methodological writing can be understood as a series of attempts to account for the "theory-ladenness" of empirical observation and to bridge the gap between "emergence" and "theoretical sensitivity". These attempts followed two different lines: [10]
On the one hand, Barney GLASER tried to clarify the concept of theoretical sensitivity in a monograph of his own, published in 1978, with the help of the term "theoretical coding", a process which he demarcates from "substantive coding". Two different types of codes correspond to these two forms of coding: substantive codes and theoretical codes. [11]
Substantive codes are developed ad hoc during "open coding", the first stage of the coding process, and relate to the empirical substance of the research domain. Theoretical codes, which researchers must always have at their disposal, "conceptualize how the substantive codes may relate to each other as hypotheses to be integrated into a theory" (p.72). Theoretical codes are used, in other words, to combine substantive codes into a theoretical model of the domain under scrutiny. The examples GLASER gives of such theoretical codes are formal concepts from epistemology and sociology which make basic claims about the ordering of the (social) world, like the terms causes, contexts, consequences and conditions: by designating certain events (which were coded with the help of substantive codes) as causes and others as consequences or effects, the hitherto developed substantive codes can be integrated into a causal model. [12]
In the book "Theoretical Sensitivity" (1978) GLASER presents an extended list of terms which can be used for the purpose of theoretical coding loosely structured in the form of so called theoretical "coding families". Thereby various theoretical concepts stemming from different (sociological, philosophical or everyday) contexts are lumped together, as for example
terms which relate to the degree of an attribute or property ("degree family"), like "limit", "range", "extent", "amount" etc.,
terms which refer to the relation between a whole and its elements ("dimension family"), like "element", "part", "facet", "slice", "sector", "aspect", "segment" etc.,
terms which refer to cultural phenomena ("cultural family"), like "social norms", "social values", "social beliefs" etc.,
and 14 further coding families which contain terms from highly diverse theoretical backgrounds, debates and schools of philosophy or the social sciences. Notably, many terms can be subsumed under several "coding families": the term "goal", for instance, is part of a coding family referring to action strategies ("strategies family") and also belongs to a coding family referring to the relation between means and ends ("means-goal family"), as the sketch below illustrates. [13]
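Rendered in code, a coding family is nothing more than a labelled collection of sensitizing terms. The following sketch (in Python, purely illustrative and not part of GLASER's own work) keeps the family names from the list above, but the term sets are abridged and partly assumed; it also shows the overlap just mentioned, since "goal" falls into two families:

# Illustrative only: GLASER's coding families rendered as plain,
# labelled collections of terms. Family names follow the text;
# the term lists are abridged and partly assumed.
from typing import Dict, Set, List

CODING_FAMILIES: Dict[str, Set[str]] = {
    "degree family": {"limit", "range", "extent", "amount"},
    "dimension family": {"element", "part", "facet", "slice",
                         "sector", "aspect", "segment"},
    "cultural family": {"social norms", "social values", "social beliefs"},
    "strategies family": {"strategy", "tactic", "technique", "goal"},
    "means-goal family": {"means", "end", "purpose", "goal"},
}

def families_of(term: str) -> List[str]:
    """Return the names of all families containing the given term."""
    return [name for name, terms in CODING_FAMILIES.items() if term in terms]

print(families_of("goal"))  # -> ['strategies family', 'means-goal family']

Such a representation also makes plain what is criticised below: nothing in the list itself tells the researcher how to combine these terms into an explanation of empirical phenomena.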
Thus GLASER offers a toolkit (one is tempted to say: a hotchpotch) of concepts which are meant to guide the researcher in developing theoretical sensitivity, but he fails to explain how such terms can be used and combined to describe and explain empirical phenomena. That this task remains extremely difficult and can hardly be achieved by applying a single coding family can easily be shown with regard to the first and most important coding family, the one referring to causal relations. The problem with this coding family is that general notions of cause and effect can never sufficiently specify which types of events in a certain domain have to be regarded as causes and which are to be seen as effects. Having terms denoting causal relations (like "cause", "condition", "consequence" etc.) at hand is in itself not sufficient for the development of causal models. Using such a coding family, one could in principle consider as causes and effects all events which covary to a certain degree. To formulate a causal model about the relation between specific events it would be necessary to use at least one substantive (i.e. sociological, psychological ...) category which provides a clue about which types of events regularly covary. In order to develop theoretical models about empirical phenomena, formal or logical concepts (like "causality") have to be combined with substantive sociological concepts (like "social roles", "identity", "culture"). A major problem with GLASER's list of coding families is that it completely lacks such a differentiation between formal and substantive notions. Thus the concept of theoretical coding offers an approach to overcoming the inductivism of early Grounded Theory, but its utility for research practice is limited, since it does not clarify how formal and substantive concepts can be meaningfully linked to each other in order to develop empirically grounded theoretical models. [14]
2.2 STRAUSS' and CORBIN's approach: axial coding and the coding paradigm
In his book "Qualitative Analysis for Social Scientists", published in 1987, Anselm STRAUSS describes a more straightforward and less complicated way how researchers may code empirical data with a theoretical perspective in mind. As with earlier versions of Grounded Theory the analyst starts with open coding "scrutinizing the fieldnote, interview, or other document very closely; line by line, or even word by word. The aim is to produce concepts that seem to fit the data" (STRAUSS 1987, p.28). Thereby STRAUSS notes certain difficulties novices "have in generating genuine categories. The common tendency is simply to take a bit of the data (a phrase or sentence or paragraph) and translate that into a precis of it" (p.29). Such difficulties can be overcome by using the so called "coding paradigm" "especially helpful to beginning analysts" (p.27). It consists of four items, namely "conditions", "interaction among the actors", "strategies and tactics" and "consequences", which can be used explicitly or implicitly to structure the data and to clarify relations between codes. This coding paradigm can be especially helpful during "axial coding" which "consists of intense analysis done around one category at time in terms of the paradigm items" (p.32). [15]
This idea is developed further in "Basics of Qualitative Research", a book written by Anselm STRAUSS and Juliet CORBIN in 1990. Like GLASER, STRAUSS and CORBIN take into account the fact that any empirical investigation needs an explicit or implicit theoretical framework which helps to identify categories in the data and to relate them in meaningful ways. While GLASER had used a list of more or less related sociological and formal terms for that purpose, STRAUSS and CORBIN drew on a single general model of action rooted in pragmatist and interactionist social theory (cf. CORBIN 1991, p.36; STRAUSS 1990, p.7) to build a skeleton or "axis" for developing grounded theories. This "paradigm model" is used "to think systematically about data and to relate them in very complex ways" (STRAUSS & CORBIN 1990, p.99) and to determine the main purpose of theory construction: analysing and modelling the action and interaction strategies of the actors. Special emphasis is placed on the intentions and goals of the actors and on the processual character of human action and interaction. [16]
Drawing on GLASER's terminology, one could regard STRAUSS' and CORBIN's coding paradigm as an elaborated coding family which guides a certain theoretical coding process (called "axial coding" by STRAUSS and CORBIN): categories and concepts developed during open coding are examined as to whether they relate to (1) phenomena at which the action and interaction in the domain under study are directed, (2) causal conditions which lead to the occurrence of these phenomena, (3) attributes of the context of the investigated phenomena, (4) additional intervening conditions by which the investigated phenomena are influenced, (5) action and interactional strategies the actors use to handle the phenomena, and (6) the consequences of their actions and interactions. During axial coding the analyst tries to find out which types of phenomena, contexts, causal and intervening conditions and consequences are relevant for the domain under study. If, for instance, social aspects of chronic pain are investigated, the researcher may try to identify typical action contexts which are relevant for patients with chronic pain as well as characteristic patterns of pain management strategies. Thereafter it can be examined which pain management strategies are used by persons with chronic pain under certain conditions and in varying action contexts. This may lead to the construction of models of action which capture the variance of the observed actions in the domain under study and which can provide the basis for a theory about the action strategies generally pursued in certain situations. [17]
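The structure of the paradigm model can be made tangible with a small sketch. The following Python fragment is purely illustrative and is not part of STRAUSS' and CORBIN's own work; it merely renders the six paradigm items as fields of a record, filled with invented values from the chronic pain example above:

# Illustrative only: the six paradigm items rendered as fields of a
# record; all example values are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AxialCoding:
    """One category analysed 'around the axis' of the coding paradigm."""
    phenomenon: str
    causal_conditions: List[str] = field(default_factory=list)
    context: List[str] = field(default_factory=list)
    intervening_conditions: List[str] = field(default_factory=list)
    strategies: List[str] = field(default_factory=list)
    consequences: List[str] = field(default_factory=list)

pain_management = AxialCoding(
    phenomenon="living with chronic pain",
    causal_conditions=["chronic illness", "injury"],
    context=["workplace", "family life"],
    intervening_conditions=["social support", "access to medication"],
    strategies=["pacing daily activities", "concealing pain at work"],
    consequences=["maintained employment", "social withdrawal"],
)

The point of the sketch is that axial coding relates one category at a time to the same fixed set of slots; the theoretical commitment lies in the choice of the slots, not in the data.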
Within this new and refined framework of Grounded Theory methodology, STRAUSS and CORBIN also take a more liberal position concerning the role of literature in the research process, maintaining that "all kinds of literature can be used before a research study is begun ..." (STRAUSS & CORBIN 1990, p.56). [18]
3. The Split Between GLASER and STRAUSS in the 1990s
After finishing their cooperation in joint research projects, GLASER and STRAUSS followed different paths in their attempts to elaborate and clarify crucial methodological tenets of Grounded Theory, and their approaches thus diverge to a considerable extent. In 1992 GLASER turned against STRAUSS' and CORBIN's version of Grounded Theory in a monograph titled "Emergence vs. Forcing: Basics of Grounded Theory Analysis", published by his private publishing venture and written in an exceptionally polemical style. In this book he accuses STRAUSS and CORBIN of having betrayed the common cause of Grounded Theory. The charge which is restated in various forms throughout the book, and which represents the crucial thread of GLASER's critique, is that by using concepts such as "axial coding" and "coding paradigms" researchers "force" categories on the data instead of allowing the categories to "emerge". Contrary to STRAUSS and CORBIN, GLASER maintains that researchers following the "true path" of Grounded Theory methodology have to approach their field without any precise research questions or research problems ("He moves in with the abstract wonderment of what is going on that is an issue and how it is handled", GLASER 1992, p.22) and insists that "there is a need not to review any of the literature in the substantive area under study" (p.31). Following GLASER, the application of theoretical background knowledge about the substantive field has to be considered harmful when developing grounded theories: "This dictum is brought about by the concern to not contaminate, be constrained by, inhibit, stifle or otherwise impede the researcher's effort to generate categories, their properties, and theoretical codes" (ibid.). [19]
GLASER thus firmly reaffirms the inductivist rhetoric already put forward in the Discovery book, claiming that theoretical insights about the domain under scrutiny "emerge" directly from the data if and only if researchers free themselves from any previous theoretical knowledge. Nevertheless, GLASER's version of Grounded Theory takes the basic problems of inductivism into account to a certain extent: a strategy of scientific investigation which approaches an empirical domain without any theoretical preconceptions is simply not feasible; such a method would yield a plethora of incoherent observations and descriptions rather than empirically grounded categories or hypotheses. The concepts of theoretical sensitivity and theoretical codes can be seen as attempts to solve this foundational epistemological problem. But theoretical sensitivity, the ability to grasp empirical phenomena in theoretical terms, requires extended training in sociological theory (cf. GLASER 1992, p.28). Consequently, the "coding families" proposed by GLASER in his 1978 book on theoretical sensitivity are of limited help for novices in empirical research, who will have serious difficulties handling the more or less unsystematic list of theoretical terms from various sociological and epistemological backgrounds offered by GLASER. A researcher with broad theoretical background knowledge and longstanding experience in the application of theoretical terms, on the other hand, would certainly not need such a list. [20]
STRAUSS' and CORBIN's concept of a "coding paradigm" serves to explicate the construction of the theoretical framework necessary for the development of empirically grounded categories in a much more user-friendly way. By drawing on this concept, researchers with limited experience in the application of theoretical knowledge can use Grounded Theory methodology without running the risk of drowning in the data. One has to be very clear about the fact, however, that the coding paradigm stems from a specific theoretical tradition: pragmatist social theory rooted in the works of DEWEY and MEAD. Therefore GLASER's suspicion that an application of the coding paradigm may lead to the "forcing" of categories on the data cannot simply be dismissed. However, if one looks more thoroughly at the conceptual design of STRAUSS' and CORBIN's coding paradigm, GLASER's critique seems overdrawn: the general theory of action underlying the coding paradigm carries a broad and general understanding of action which is compatible with a wide variety of sociological theories (ranging e.g. from Rational Choice Theory to functionalist role theory or even sociological phenomenology). It can also be argued that the coding paradigm to a great extent represents an everyday understanding of purposeful and intentional human action useful for the description of a wide array of social phenomena. It must be noted, however, that STRAUSS' and CORBIN's coding paradigm is linked to a perspective on social phenomena prevalent in micro-sociological approaches which emphasise the role of human action in social life. Researchers with a strong background in macro-sociology and systems theory may feel that this approach runs contrary to their requirements and would be well advised to construct a coding paradigm of their own rooted in their own theoretical tradition. [21]
GLASER's approach of "theoretical coding" whereby researchers introduce ad hoc theoretical codes and coding families which they find suitable for the data under scrutiny provides a strategy applicable for a greater variety of theoretical perspectives. However, as has been said before following this strategy is much more challenging esp. for novices since it lacks a readymade conceptual framework like STRAUSS' and CORBIN's coding paradigm. However, it is interesting to note that GLASER's work obviously does not suggest a highly pluralistic use of coding families (which would include the use of concepts from macro-sociological approaches) since he seems to share STRAUSS' strong inclination towards action and action theory; at least in his monograph "Theoretical Sensitivity" he asserts that coding and coded incidents have to be related to actions of the actors in the empirical domain. [22]
One of the most crucial differences between GLASER's and STRAUSS' approaches to Grounded Theory lies in the fact that STRAUSS and CORBIN propose the utilisation of a specified theoretical framework based on a certain understanding of human action, whereas GLASER emphasises that coding, as a process of combining "the analyst's scholarly knowledge and his research knowledge of the substantive field" (1978, p.70), has to be realised ad hoc, which means that it often has to be conducted on the basis of a more or less implicit theoretical background knowledge. Compared to this major dissimilarity, other differences between the two approaches play a minor role. GLASER, however, seems to overstate some of them for rhetorical reasons. By highlighting the "emergence" of theoretical concepts from the data he is drawn towards exaggerated truth claims: following GLASER, the task of empirical research is the discovery of social worlds and "facts" "as they really are". "In grounded theory (...) when the analyst sorts by theoretical codes everything fits, as the world is socially integrated and grounded theory simply catches this integration through emergence" (1992, p.84). Following such claims, any further examination of the "emerged" verities becomes superfluous and a falsification of theoretical statements developed from the data would simply be impossible. This repudiates not only the well-established idea (which is now common wisdom in almost every empirical science) that the purpose of empirical research is not to discover unchangeable verities but to tentatively propose and further corroborate hypotheses, but also the epistemological insight that any empirical phenomenon can be described in various ways and that any object can be analysed under different theoretical perspectives. Instead it is suggested that if and only if the analyst frees himself or herself from any previous theoretical knowledge will the "emergence" of categories from the data ensure that only relevant aspects of the phenomena under scrutiny are recognised and described. This in fact represents the dogmatic inductivism prominent in early empiricist philosophy: the conviction, put forward for instance by BACON, that researchers who have cleansed themselves of all theoretical preconceptions and wrong "idols", and have thus transformed their minds into a tabula rasa, would gain the ability to grasp empirical facts "as they really are". However, GLASER makes clear elsewhere that theoretical concepts do not simply arise from the data alone but through careful "theoretical coding", that is, by categorising empirical data on the basis of previous theoretical knowledge. Thus the suspicion arises that the parlance of "emergence" serves to legitimise a specific style of research: under this perspective the "emergence talk" would not describe a methodological strategy but would simply offer a way to immunise theories with the help of a methodological rhetoric, according to which a researcher who follows the "right path" of Grounded Theory cannot go wrong, since the concepts have emerged from the data. [23]
4. Towards a Clearer Understanding of the "Grounding" of Categories and Theories
From its beginnings the methodology of Grounded Theory has suffered from an "inductivist self-misunderstanding" entailed by some parts of the Discovery book. Although this inductivism plays a limited role in the research practice of many Grounded Theory studies (including those of the founding fathers), it has often led to confusion, especially among novices who draw their basic methodological knowledge from textbooks. In the past decades Grounded Theory has made considerable progress in overcoming the naïve empiricism of the emergence talk: the concepts of "theoretical sensitivity", "theoretical coding", "axial coding" and "coding paradigms" represent important steps in the development of an adequate understanding of how qualitative data can be used in the process of developing theoretical categories. Thus one can use Grounded Theory procedures without adhering to a basic "dogma of empiricism" (QUINE 1951), namely the idea that at a certain stage of the research process a kind of observation and description of empirical phenomena must take place which is not "contaminated" by theoretical notions. However, inductivism still plays a vital role in the image of Grounded Theory held by a wider audience as well as in internal methodological discussions, as the previous examples have shown. As a consequence, many epistemologically informed social scientists repudiate Grounded Theory after having read writings which seem to deny the trite epistemological fact that there can be no empirical observations "unimpregnated by expectations". [24]
In the following it will be shown that the explicit use and discussion of some concepts which are well known and widely discussed in contemporary methodology and epistemology could lead to a better understanding of the nature of empirically grounded theory construction, especially since these concepts are already used implicitly in Grounded Theory methodology:
the concept of abductive (or retroductive) inference,
the concept of empirical content or falsifiability,
the concept of corroboration. [25]
4.1 Abductive inference as a logical foundation of theory building
In conceptualising the process of theory generation in empirical research, a false alternative is often set up between an inductivist concept and the hypothetico-deductive (H-D) model of theory generation. According to the H-D model, often favoured by quantitative methodologists, research is a process of hypothesis testing by means of experimental or quasi-experimental strategies. On this view, hypotheses cannot be derived from data but emerge from the researcher's speculations or happy guesses. The next step of the research process is the rational elaboration of such hypotheses and the operationalisation of their main elements, so that the hypotheses can be tested. In the context of the H-D model the researcher therefore always has to develop precise hypotheses before collecting empirical data. Consequently, qualitative research, which implies the utilisation of unstructured data and the generation of theories from that material, would not be considered a rigorous and valid research strategy from the viewpoint of the H-D model. [26]
However, since the 1970s a number of empirical investigations into the history of science have shown that the H-D model cannot provide an adequate account of numerous scientific discoveries, even in the natural sciences. As a consequence, a lively discussion about logics of discovery and rational heuristics has taken place in the modern philosophy of science and has challenged the view, put forward by proponents of the H-D model, that hypotheses emerge through a process governed by mere speculation or "happy guesses". Investigations into the history of the natural sciences demonstrate that scientific discoveries were in fact not merely momentary mental episodes inaccessible to rational reconstruction (cf. HANSON 1965; CURD 1980; NICKLES 1980, 1985, 1990). Although the context of discovery always contains elements of intuition and creativity, the generation of a hypothesis can be reconstructed as a reasoned and rational affair. In one of the most illuminating reconstructions of scientific discoveries, Norwood HANSON (1965) uses KEPLER's discovery of the planetary orbits to show that the logical inferences which lead to the discovery of new theoretical insights are neither inductive nor deductive. Instead they represent a special kind of logical reasoning whose premises are a set of empirical phenomena and whose conclusion is an explanatory hypothesis. [27]
HANSON called this form of reasoning retroductive inference; in more recent writings it has also been called "inference to the best explanation" (ACHINSTEIN 1992). One could also use the term "hypothetical reasoning", which reflects its specific role in the research process: hypothetical inferences serve to discover a hypothesis which explains certain empirical findings. [28]
The earliest concepts of hypothetical reasoning were developed by the pragmatist philosopher Charles Sanders PEIRCE, who described a third form of inference, apart from deduction and induction, which he called "hypothesis" or "abduction". Deductive reasoning is the application of general rules to specific cases in order to infer a result.
"The so-called major premise lays down this rule; as for example, 'All men are mortal'. The other or minor premise states a case under the rule; as 'Enoch was a man'. The conclusion applies the rule to the case and states the result: 'Enoch is mortal'" (1974/1979, 2.621). [29]
Induction is an inversion of this deductive syllogism: by induction one generalises from a number of cases in which a certain result is observed and infers a general rule, claiming that this result can be observed in all cases of the class to which the observed cases belong. Another way of inverting a deductive syllogism is hypothetical inference, which starts with an empirical phenomenon and proceeds to a general statement which explains the observed phenomenon. In doing so, the researcher either has a general rule at his or her disposal that leads to a possible explanation, or the hypothetical inference serves as a means to discover new, hitherto unknown concepts or rules. Often such an "abductive" inference (cf. REICHERTZ 2003) starts with a surprising, anomalous event which cannot be explained on the basis of previous knowledge: "The surprising fact, C is observed. But if A were true, C would be a matter of course. Hence there is a reason to suspect that A is true" (PEIRCE 1974/1979, 5.189). [30]
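The relation between the three forms of inference can be summarised schematically. The following arrangement is a common textbook-style reconstruction built from the "Enoch" syllogism quoted above; the induction and abduction variants are extrapolated here for illustration and are not quotations from PEIRCE:

\begin{align*}
\textbf{Deduction:} \quad & \text{Rule: all men are mortal.} \\
                          & \text{Case: Enoch is a man.} \\
\Rightarrow\;             & \text{Result: Enoch is mortal.} \\[1ex]
\textbf{Induction:} \quad & \text{Case: Enoch, Elijah, \dots\ are men.} \\
                          & \text{Result: Enoch, Elijah, \dots\ are mortal.} \\
\Rightarrow\;             & \text{Rule (generalised): all men are mortal.} \\[1ex]
\textbf{Abduction:} \quad & \text{Rule: all men are mortal.} \\
                          & \text{Result: Enoch is mortal.} \\
\Rightarrow\;             & \text{Case (hypothesis): Enoch is a man.}
\end{align*}

Both inversions are logically ampliative and therefore fallible: in the abductive case the conclusion is only a candidate explanation of the result, not a necessary consequence.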
Confronted with an anomalous event, "we turn over our recollection of observed facts; we endeavour so to rearrange them, to view them in such new perspective that the unexpected experience shall no longer appear surprising" (7.36). This is, of course, a creative endeavour which sometimes "comes to us like a flash" (5.182). Nevertheless, the researcher's creativity is limited by certain constraints and methodological rules. First of all, the originality of the newly developed hypotheses is limited by the facts which must be explained. "It is not pure, ontological originality in the relation to the ideas and perceptual facts at hand. Hypotheses can be original, but only if they still may explain the facts in question" (ANDERSON 1987, p.44). Furthermore, an abductive inference must not only lead to a satisfactory explanation of the observed facts but must also be related to the previous knowledge of the researcher; "the different elements of the hypothesis were in our minds before", as PEIRCE put it (1974/1979, 5.181). For that reason abductions do not create new knowledge ex nihilo. Instead, every new insight combines "something old and something hitherto unknown" (7.536). Abduction becomes an innovative process by modifying and combining several elements of previous knowledge: "it is the idea of putting together what we had never before dreamed of putting together which flashes the new suggestion before our contemplation" (5.182). Scientific discoveries always require the integration of previous knowledge and new experience, "(...) that is to say, we put old ideas together in a new way and this reorganization itself constitutes a new idea" (ANDERSON 1987, p.47). Many of the theoretical insights and developments in sociology which have led to new and convincing explanations of social phenomena may be reconstructed as arising from abductive inferences. This especially relates to so-called middle-range theories, for instance DURKHEIM's idea that differences between suicide rates result from differing levels of "anomie", or WEBER's explanation of the economic success of Protestant merchants as a consequence of their religious orientations. The "labelling approach", which attempted to understand "mental illness" or deviance not as an inherent personal quality or attribute of individual actors but as a result of processes of social interaction, may serve as another good example. All these theoretical explanations of social phenomena, which mark significant theoretical advancements in sociology, started with sometimes surprising, anomalous or puzzling empirical phenomena which were explained by drawing on theoretical concepts or ideas previously not applied to the domain under scrutiny: thus WEBER related success in worldly affairs to religious beliefs referring to transcendent realities, and the proponents of the labelling approach interpreted odd or problematic behaviour as a result of interactive processes of role definition and identity formation. In making abductive inferences, researchers depend on previous knowledge that provides them with the necessary categorical framework for the interpretation, description and explanation of the empirical world under study. If an innovative research process is to be successful, this framework must not work as a Procrustean bed into which empirical facts are forced. Instead, the framework which guides empirical investigations should be modified, rebuilt and reshaped on the basis of the empirical material. [31]
4.2 Empirical content or falsifiability as a criterion for the applicability of theoretical preconceptions in qualitative inquiry
Hypothetical inferences combine new and interesting empirical facts with existing theoretical knowledge in a creative way. By no means does that imply that the theoretical knowledge of the qualitative researcher should, from the outset, form a fully coherent network of explicit propositions from which precisely formulated and empirically testable statements can be deduced. Rather it should constitute a (sometimes only loosely connected) "heuristic framework" of concepts (or "coding families") which helps the researcher to focus attention on certain phenomena in the empirical field. But doesn't that mean that the theoretically sensible categorising and "coding" of data is merely a gift of charismatic researchers? Can certain aspects of it be made explicit, for instance by determining relevant "theoretical codes" before the data are coded? Is the construction and use of an (at least partly) predefined category scheme a sensible strategy in qualitative analysis, or does it necessarily lead the researcher astray, so that he or she abandons a basic principle of qualitative research, namely the discovery of new patterns and relations? [32]
To solve this problem it is helpful to discuss a concept which plays an important role in the writings of Karl POPPER and other traditional proponents of the H-D model: "falsifiability" or "empirical content". This concept is normally used to identify sound scientific hypotheses within an H-D framework. In this context only clear-cut and precisely formulated propositions with empirical content are regarded as adequate hypotheses, whereas concepts and hypotheses which lack empirical content, and thus cannot be falsified, are considered highly problematic. Theoretical concepts with low empirical content can, however, play an extremely useful role if the goal of empirical research is not the testing of predefined hypotheses but the empirically grounded generation of theories, since they do not force data into a Procrustean bed: their lack of empirical content gives them the flexibility to describe a wide variety of empirical phenomena. Although such concepts cannot be "tested" empirically, they may be used as heuristic concepts which represent "lenses" through which researchers perceive facts and phenomena in their research field. [33]
Two different types of such heuristic concepts may be used to define a category scheme usable for the structuring and analysis of qualitative data, a scheme which can be supplemented, refined and modified in the ongoing process of empirical analysis: [34]
The first important type of heuristic concept comprises theoretical notions, definitions and categories drawn from "grand theories" in the social sciences which are too broad and abstract for empirically contentful propositions to be deduced from them directly. Herbert BLUMER coined the term "sensitizing concepts" to describe theoretical terms which "lack precise reference and have no bench marks which allow a clean cut identification of a specific instance" (1954, p.7). Sensitizing concepts are useful tools for description but not for prediction, since their lack of empirical content permits researchers to apply them to a wide array of phenomena. Regardless of how empirically contentless and vague they are, they may serve as heuristic tools for the construction of empirically grounded theories. [35]
A concept like "role-expectations" can serve as a good example for that. The assertion that individuals act in accordance with role expectations does not imply a lot of information by itself. This concept may, however, be useful to formulate a variety of research questions for the investigation of different substantive fields: Do role expectations play an important role in the empirical domain under study? What kind of role expectations can be found? By which means do empirical actors try to meet them? Do certain actors develop strategies to avoid the fulfilment of role expectancies? Are such strategies revealed by other actors in the investigated field? Etc. Concepts from so called "utility theory" may serve as another example: at the core of utility theory is the idea that human actors will choose the action which seems the most adequate for the achievement of a desired goal from a set of given action alternatives. However, without specifying which goals the actors pursue and which actions they consider to be adequate, such a proposition has no empirical content. The theory is like an "empty sack" (cf. SIMON 1985), if one does not specify further auxiliary assumptions. Instead of allowing for the development of precise hypotheses utility theory may provide researchers with useful research questions and heuristic codes: qualitative researchers may, for instance, code text segments which refer to the potential costs and benefits that certain actions may have for the actors, they may code segments which relate to the intentions and goals of the research subjects or the means they use to reach their goals etc. In this manner researchers can draw on a wide variety of abstract notions from different theoretical traditions to structure their data. But one should never forget in this process that sticking to certain theoretical tradition makes it easier to structure the data but also carries the risk that concepts are neglected that would suit the data even better and would yield more interesting insights. Even sensitizing and heuristic concepts that capture all kinds of different phenomena may lead to an exclusion of other theoretical perspectives: thus the extended use of concepts with a strong background in micro-sociological action theory (e.g. "actor", "purposes" ...) can preclude a system theory and macro-perspective. [36]
A strategy for coping with this risk (better suited than waiting for the "emergence" of the most adequate theoretical notions from the data) is the use of different and even competing theoretical perspectives on the same data. Furthermore, special attention should be paid to the question of whether a chosen theoretical concept leads to the exclusion or neglect of certain phenomena and incidents contained in the data. [37]
A second type of category which does not force the data but allows for the discovery of previously unknown relations and patterns comprises categories which relate to general topics of interest covered in the data material. Such topic-oriented categories can often be found easily by drawing on general common-sense knowledge or on specific local knowledge of the investigated field. Categories like "school", "work" or "family" represent simple examples, but topic-oriented categories may be far more complex. As with heuristic theoretical concepts, however, one should always ask whether a certain code really serves heuristic purposes or whether it excludes relevant phenomena from examination. [38]
Both types of heuristic categories, those developed from common-sense knowledge as well as those derived from abstract theoretical concepts, fit various kinds of social reality. That means it is not necessary to know concrete facts about the investigated domain in order to start using these concepts for data analysis. In other words, heuristic categories cannot be used to construct empirically contentful propositions without additional information about empirical phenomena. This makes them rather useless in the context of an H-D strategy, but it is their strength in the context of exploratory, interpretative research. Regardless of whether heuristic categories are derived from common-sense knowledge or from abstract theoretical concepts, the following rule always applies: with decreasing empirical content the risk that the data are "forced" diminishes. [39]
Thus the epistemological concepts of "empirical content" and "falsifiability" can help to identify preconceptions which qualitative researchers (whether they apply Grounded Theory methodology or not) may use to structure the data material while minimising the risk of violating basic methodological principles of qualitative research. Previous theoretical knowledge can be used at any stage of the process of empirically grounded theory construction if the researchers draw on theoretical concepts with limited empirical content (which the H-D approach would dissuade us from using). The researcher may start qualitative analysis by using heuristic concepts and may then proceed to the construction of categories and propositions with growing empirical content. In this process grand theories play the role of a theoretical axis or "skeleton" to which the "flesh" of empirically contentful information from the research field is added in order to develop empirically grounded categories and propositions. [40]
However, in some cases the use of categories and assertions with high empirical content can also prove fruitful in a qualitative study. A researcher investigating the process of care-giving to frail and elderly people, for instance, may discover that Arlie HOCHSCHILD's concept of "emotional labour" (1983) is helpful for understanding phenomena in the research domain. This concept was initially developed to describe typical patterns of action and interaction between flight attendants and air passengers, but it can be transferred to other domains of social services. Obviously this concept carries more empirical content than the term "role expectation": unlike the latter, "emotional labour" cannot be related to just any social interaction. There are obviously social interactions which do not require emotional labour, and the assertion that certain service providers are expected to do emotional labour can in principle be falsified. On the other hand, the concept can be rather illuminating in understanding social relations in various fields. [41]
Consequently, it can sometimes be sensible in qualitative research to use concepts which are closer to the understanding of the term "theory" in H-D research: definite categories and propositions about a certain field of social action that entail enough empirical content to be tested. There is no reason to abstain from such concepts, especially since their use represents a long and well-established tradition in qualitative research. Researchers and methodologists from the "Chicago School" of American sociology had already proposed in the 1930s a research strategy named "Analytic Induction", which was used thereafter in many famous qualitative studies. In this strategy initial hypotheses are examined and modified with the help of empirical evidence provided by so-called "crucial cases". A well-known example comes from Donald CRESSEY's qualitative study of embezzlement. During his research he formulated, for instance, the hypothesis that
"... trust violators usually consider the conditions under which they violated their own positions of trust as the only "justifiable" conditions, just as they consider their own trust violation to be more justified than a crime such as robbery or burglary" (1973, pp.104f.)
—a statement which can in principle be falsified if one undertakes the effort of collecting data about trust violators. At a certain point in the research process CRESSEY indeed searched systematically for "crucial cases" and "negative instances" of trust violators who saw their trust violations as justifiable. [42]
However, in applying such a research strategy there is always the risk that data are structured with the help of concepts which are not suited to the specific research domain and which do not match the researcher's theoretical interests and orientations. The risk that the heuristic concepts employed contain too much empirical content for the researcher's purposes is present even with STRAUSS' coding paradigm, which can draw qualitative researchers towards a certain micro-sociological orientation which they do not necessarily share. On the other hand, the advice to use categories with low empirical content may be unhelpful for inexperienced researchers, since in a given research domain not every heuristic concept will draw the researcher's attention to sociologically relevant phenomena and thus yield insights and interesting results. This danger may arise with GLASER's "coding families": it can be a highly demanding task, especially for novices, to select the theoretical concepts most suited to a certain research domain from among numerous theoretical schools and approaches. [43]
An important task of qualitative methodology is to show a middle path between the "Scylla" of forcing the data with preconceived notions and theories not suited to the domain under study and the "Charybdis" of an indiscriminate and eclectic use of concepts from various theoretical traditions. The following methodological strategies can help researchers to avoid both dangers:
The development of empirically grounded categories and hypotheses benefits from theoretical pluralism. A pluralistic use of heuristic frameworks requires that researchers have a variety of concepts with diverging theoretical backgrounds at their disposal and make a flexible choice among them after having examined their appropriateness for the investigated phenomena. Experts with longstanding experience may be able to choose the right heuristic concept intuitively, drawing on rich theoretical background knowledge. Novices, in contrast, may benefit from an explicit style of theory building in which different "grand theories" are utilised in order to understand, explain and describe the phenomena under study. A systematic comparison of the results obtained with different heuristic concepts is by all means preferable to an "emergence talk" which masks the use of the researcher's pet concepts.
A strategy already suggested by the proponents of "Analytic Induction", the systematic search for counter-evidence, can reveal whether a given heuristic concept has high or low empirical content. If negative instances are easily found, the applied categories obviously have a high degree of falsifiability or empirical content and may not be suited as heuristic concepts for an initial attempt to structure the empirical data.
The same holds true for an extensive search for empirical phenomena to which the categories used so far do not apply. If a variety of phenomena can be identified which cannot be covered by the heuristic concepts used so far, it is obviously necessary to look for alternative concepts which are better suited to capture the investigated phenomena. [44]
4.3 The necessity of corroboration of empirically grounded categories and hypotheses
Contrary to an inductivist understanding, a model of the research process based on "hypothetical" or "abductive" inference is consistently fallibilistic; that is, it does not claim that the validity of propositions developed on the basis of empirical data can simply be ascertained by the fact that the researcher freed his or her mind from any preconceptions whatsoever before collecting these data. Hypothetical inferences may lead to rational and well-founded assertions which are consistent both with the observed phenomena and with previous theoretical knowledge. But if these assertions are not mere descriptions of observed events but represent theoretical claims, they have to be regarded as fallible. The fallibility of any theoretical claim developed on the basis of empirical observation via hypothetical inference can easily be seen from the fact that one empirical phenomenon often allows for several theoretical explanations which contradict each other but are equally compatible with existing stocks of knowledge. [45]
If one abandons the idea that definite and absolutely reliable knowledge can be derived from empirical data via induction, and if one explicitly acknowledges the role of previous theoretical knowledge in the research process, the demand to further corroborate empirically grounded theoretical concepts can no longer be regarded as an attempt to downplay the role of exploratory inquiry in comparison to methods of (experimental or quasi-experimental) hypothesis testing. This requirement rather follows as a matter of course from the methodological fact that empirical research can never provide final proof for theoretical propositions but only cumulative and always provisional evidence. Whereas STRAUSS and CORBIN pay a great deal of attention to the question of how grounded categories and propositions can be further validated, GLASER's concept shows at least a glimmer of epistemological fundamentalism (or "certism", LAKATOS 1978), especially in his defence of the inductivism of early Grounded Theory: "Grounded theory looks for what is, not what might be, and therefore needs no test" (GLASER 1992, p.67). Such sentences carry the outmoded idea that empirical research can lead to final certainties and truths, and that by using an inductive method the researcher may gain the ability to grasp "facts as they are", making any attempt at further corroboration futile. [46]
If one does not want to venture to claim infallibility for particular results of empirical research, the further examination, modification and rejection of empirically grounded hypotheses becomes an important issue. For this purpose one may not only draw on STRAUSS' and CORBIN's more recent writings on the methodology of Grounded Theory but can also use many concepts developed throughout the history of qualitative research: the already mentioned strategy of "Analytic Induction"; procedures for the examination of hypotheses in hermeneutic text interpretation, in which different "Lesarten" (readings) of the same text passage are corroborated through the sequential analysis of additional text (OEVERMANN, ALLERT, KONAU & KRAMBECK 1979); or the methods for developing and testing causal hypotheses in qualitative research proposed by Charles RAGIN (1987). Techniques developed over the past two decades for the computer-assisted categorisation, archiving and structuring of qualitative data can also support the process of further grounding theoretical concepts in the data by allowing a systematic search for empirical evidence and counter-evidence (KELLE 2004), as the sketch below illustrates. [47]
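A minimal sketch may indicate the kind of retrieval operation meant here. The functions, code labels and example data below are hypothetical assumptions for illustration; they do not reproduce the interface of any actual CAQDAS package.

```python
# Illustrative sketch: retrieving evidence and counter-evidence for a
# hypothesis of the form "where IF_CODE applies, THEN_CODE applies as well"
# from a collection of coded segments. Data and code names are hypothetical.
segments = [
    {"text": "After the diagnosis I told nobody at work.",
     "codes": {"stigma", "information control"}},
    {"text": "My colleagues knew about my illness from the start.",
     "codes": {"stigma"}},
]


def evidence(segs, if_code, then_code):
    """Segments where both codes co-occur: provisional support."""
    return [s for s in segs
            if if_code in s["codes"] and then_code in s["codes"]]


def counter_evidence(segs, if_code, then_code):
    """Segments where the antecedent code applies but the consequent does
    not: negative instances that call for modifying the hypothesis."""
    return [s for s in segs
            if if_code in s["codes"] and then_code not in s["codes"]]


print(len(evidence(segments, "stigma", "information control")),
      "supporting segment(s);",
      len(counter_evidence(segments, "stigma", "information control")),
      "counter-instance(s)")
```

Such retrievals do not test a hypothesis in any strict sense; they merely direct the researcher's attention to passages which may corroborate or challenge a developing proposition.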
"Emergence" has turned out to be a rather problematic methodological concept which reflects the empiricist idea that researchers can and must approach data with no theories in mind. However, GLASER and STRAUSS did not overlook the fact that researchers always have to draw on their existing theoretical knowledge in order to understand, describe and explain empirically observed social phenomena. An alternative to an inductivist understanding of qualitative research can already be found in the Discovery book: the researcher's "theoretical sensitivity" provides a "perspective that will help (him) see relevant data and abstract significant categories from his scrutiny of the data" (1967, p.3). Thus the earliest version of Grounded Theory contained two different concepts concerning the relation between data and theory with conflicting implications: on the one hand the idea is stressed that theoretical concepts "emerge" from the data if the researcher approaches the empirical field with no preconceived theories or hypotheses, on the other hand the researcher is advised to use his or her previous theoretical knowledge to identify theoretical relevant phenomena in the data. [48]
Much of GLASER's and STRAUSS' later methodological work can be understood as an attempt to further develop the concept of theoretical sensitivity in order to reconcile these prima facie divergent ideas. STRAUSS proposes the use of a general theory of action to build an axis for the emerging theory. GLASER, although he was later (1992) to repudiate STRAUSS' concepts entirely, had proposed a similar idea in 1978: theoretical codes represent those theoretical concepts which the researcher has at his or her disposal independently of data collection and data analysis. Thus the controversy between GLASER and STRAUSS boils down to the question of whether the researcher should use a well-defined "coding paradigm" and systematically look for "causal conditions", "phenomena", "context", "intervening conditions", "action strategies" and "consequences" in the data, or whether he or she should employ theoretical codes ad hoc, drawing on a huge fund of "coding families". [49]
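To make the difference more tangible, the coding paradigm can be pictured as a fixed record whose slots the analyst attempts to fill for each phenomenon during axial coding. The following sketch is my own illustration under that reading, not STRAUSS' and CORBIN's notation; the example entries are invented.

```python
# Illustrative sketch: the coding paradigm as a record with one slot per
# paradigm category. Example entries are hypothetical.
from dataclasses import dataclass, field


@dataclass
class AxialCodingMemo:
    phenomenon: str
    causal_conditions: list[str] = field(default_factory=list)
    context: list[str] = field(default_factory=list)
    intervening_conditions: list[str] = field(default_factory=list)
    action_strategies: list[str] = field(default_factory=list)
    consequences: list[str] = field(default_factory=list)


memo = AxialCodingMemo(
    phenomenon="concealing a chronic illness at the workplace",
    causal_conditions=["fear of discrimination"],
    context=["insecure employment"],
    action_strategies=["selective disclosure to trusted colleagues"],
    consequences=["constant vigilance", "emotional strain"],
)
print(memo.phenomenon, "->", memo.consequences)
```

GLASER's "coding families", by contrast, would correspond to an open repertoire of such schemata from which the analyst selects ad hoc, rather than to one fixed record.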
Both strategies have their pros and cons: novices who wish for clear advice on how to structure data material may be well served by the coding paradigm. Since the paradigm consists of theoretical terms which carry only limited empirical content, the risk that data are forced by its application is not very high. However, it must not be forgotten that it is tied to a certain micro-sociological perspective. Many researchers may concur with that perspective, especially since qualitative research has always had an affinity to micro-sociological action theory, but others who wish to employ a macro-sociological or systems-theoretical perspective may feel that the use of the coding paradigm would lead them astray. [50]
Experienced researchers with a broad knowledge of social theory would clearly benefit from the advantages of theoretical coding—having at their disposal not just one possible axis for the developing theory, but being able to construct such an axis themselves by combining theoretical concepts from different schools of thought. But regardless of which "theoretical codes" or "coding paradigms" are applied, empirically grounded theory building should always be guided by an adequate epistemological understanding of the relation between data and theory. It is of utmost importance here to abandon inductivist rhetoric and to develop a clear understanding of the role of inductive and abductive inferences in the process of empirically grounded theory generation. Furthermore, it must be stressed that any scientific discovery requires the integration of previous knowledge and new empirical observations, and that researchers always have to draw on previous theoretical knowledge which provides the categorical frameworks necessary for the interpretation, description and explanation of the empirical world. [51]
To make sure that the application of theoretical knowledge does not force the data into a Procrustean bed, one needs to differentiate carefully between different types of theoretical statements (namely between definite and precise hypotheses on the one hand and broad and general heuristic concepts on the other) and their differing roles in the process of theory generation. Empirically grounded theory building starts with a careful choice among a variety of concepts with diverging theoretical backgrounds, made after examining their appropriateness for the investigated phenomena. Using such a heuristic framework as the axis of the developing theory, one carefully proceeds to the construction of categories and propositions with growing empirical content. This should be accompanied by a meticulous search for negative instances and for empirical phenomena to which the heuristic categories used do not apply and which would call for their reformulation or abandonment. This style of inquiry should be supplemented by strategies for the further corroboration of the empirically contentful categories and propositions developed in the ongoing course of theory building. [52]
Achinstein, Peter (1992). Inference to the Best Explanation: Or, Who Won the Mill-Whewell Debate? Studies in History and Philosophy of Science, 23, 349-364.
Anderson, Douglas R. (1987). Creativity and the Philosophy of C.S. Peirce. Dordrecht: Martinus Nijhoff.
Blumer, Herbert (1954). What is Wrong with Social Theory? American Sociological Review, 19, 3-10.
Chalmers, Alan F. (1999). What is this Thing Called Science? Maidenhead: Open University Press.
Corbin, Juliet (1991). Anselm Strauss: An Intellectual Biography. In David R. Maines (Ed.), Social Organization and Social Process. Essays in Honor of Anselm Strauss (pp.17-44). New York: Aldine de Gruyter.
Cressey, Donald R. (1950). The Criminal Violation of Financial Trust. American Sociological Review, 15, 738-743.
Cressey, Donald R. (1953/1971). Other People's Money. A Study in the Social Psychology of Embezzlement. Belmont: Wadsworth.
Curd, Martin V. (1980). The Logic of Discovery: An Analysis of Three Approaches. In Thomas Nickles (Ed.), Scientific Discovery, Logic and Rationality (Boston Studies in the Philosophy of Science, Vol. LVI, pp.201-219). Dordrecht: Reidel.
Glaser, Barney (1978). Theoretical Sensitivity. Advances in the Methodology of Grounded Theory. Mill Valley, CA: The Sociology Press.
Glaser, Barney (1992). Emergence vs. Forcing: Basics of Grounded Theory Analysis. Mill Valley, CA: Sociology Press.
Glaser, Barney & Strauss, Anselm (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine de Gruyter.
Hanson, Norwood Russell (1965). Patterns of Discovery. An Inquiry Into the Conceptual Foundations of Science. Cambridge: Cambridge University Press.
Hochschild, Arlie (1983). The Managed Heart. Commercialization of Human Feeling. Berkeley, CA: University of California Press.
Kelle, Udo (1995). Theories as Heuristic Tools in Qualitative Research. In Ilja Maso, Paul A. Atkinson, Sarah Delamont & Jef C. Verhoeven (Eds.), Openness in Research. The Tension Between Self and Other (pp.33-50). Assen: Van Gorcum.
Kelle, Udo (2004). Computer Assisted Qualitative Data Analysis. In David Silverman, Giampietro Gobo, Clive Seale & Jaber F. Gubrium (Eds.), Qualitative Research Practice (pp.473-489). London: Sage.
Kelle, Udo; Marx, Janine; Pengel, Sandra; Uhlhorn, Kai & Witt, Ingmar (2002). Die Rolle theoretischer Heuristiken im qualitativen Forschungsprozeß – ein Werkstattbericht. In Hans-Uwe Otto, Gertrud Oelerich & Heinz-Günter Micheel (Eds.), Empirische Forschung und Soziale Arbeit. Ein Lehr- und Arbeitsbuch (pp.11-130). Neuwied and Kriftel: Luchterhand.
Lakatos, Imre (1978). The Methodology of Scientific Research Programmes. Cambridge: Cambridge University Press.
Laudan, Larry (1977). Progress and its Problems. Towards a Theory of Scientific Growth. London and Henley: Routledge & Kegan Paul.
Nickles, Thomas (Ed.) (1980). Scientific Discovery, Logic and Rationality (Boston Studies in the Philosophy of Science, Vol. LVI). Dordrecht: Reidel.
Nickles, Thomas (1985). Beyond Divorce: Current status of the discovery debate. Philosophy of Science, 52, 177-206.
Nickles, Thomas (1990). Discovery Logics. Philosophica, 45, 7-32.
Oevermann, Ulrich; Allert, Tilmann; Konau, Elisabeth & Krambeck, Jürgen (1979). Die Methodologie einer "objektiven Hermeneutik" und ihre allgemeine forschungslogische Bedeutung in den Sozialwissenschaften. In Hans-Georg Soeffner (Ed.), Interpretative Verfahren in den Sozial- und Textwissenschaften (pp.352-434). Stuttgart: Metzler.
Peirce, Charles S. (1974, 1979). Collected Papers. Edited by Charles Hartshorne, Paul Weiss and Arthur Burks. Cambridge, MA: The Belknap Press of Harvard University Press.
Quine, Willard Van Orman (1951). Two Dogmas of Empiricism. The Philosophical Review, 60, 20-43.
Ragin, Charles (1987). The Comparative Method. Moving beyond Qualitative and Quantitative Methods. Berkeley, CA: University of California Press.
Reichertz, Jo (2003). Die Abduktion in der qualitativen Sozialforschung. Opladen: Leske und Budrich.
Simon, Herbert A. (1985). Human Nature in Politics: The Dialogue of Psychology with Political Science. The American Political Science Review, 79, 293-304.
Strauss, Anselm L. (1987). Qualitative Analysis for Social Scientists. Cambridge: Cambridge University Press.
Strauss, Anselm L. (1990). Creating Sociological Awareness. New Brunswick: Transaction Publishers.
Strauss, Anselm L. & Corbin, Julliet (1990). Basics of Qualitative Research. Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage.
Udo KELLE was recently appointed Professor of Social Research Methods at the Department of Social Sciences and Philosophy of the University of Marburg. He has written various books and articles on the methodology of qualitative research and on the integration of qualitative and quantitative methods (e.g. "Computer-aided Qualitative Data Analysis", London: Sage 1995; with Susann KLUGE, "Vom Einzelfall zum Typus", Opladen: Leske und Budrich 1999; and "Methodeninnovation in der Lebenslaufforschung", Opladen: Juventa 2001). His main research interests cover the methodological and philosophical background of social research methods and the connections between sociological theory and data in empirical research. His current work involves extensive research on Mixed Methods Designs and their application in sociological life course research.
Contact:
PD Dr. Udo Kelle
Institute for Sociology
Philipps-University of Marburg
Ketzerbach 11
35032 Marburg, Germany
E-mail: udo.kelle@staff.uni-marburg.de
Kelle, Udo (2005). "Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered [52 paragraphs]. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 6(2), Art. 27, http://nbn-resolving.de/urn:nbn:de:0114-fqs0502275.
Revised 6/2012