Post date: Sep 15, 2012 5:54:3 PM
Here, from latest to earliest, are the posts from an RM-NET exchange on causality. Towards the end, second-year graduate student Rohny Saylors frames the discussion as a case of important people in the field using "rhetorical moves" to bully him, a hapless victim. Which is quite the rhetorical move of his own. Once it is framed this way, it is no longer morally acceptable to disagree with him, which pretty much ends the useful part of the discussion.
In his last posting Rohny wrote that "I must admit I find the rhetorical move of insulting those in the fields of cultural anthropology and sociology as liars quite intimidating." It's indeed unfortunate that he misinterpreted what has been said about qualitative research and qualitative researchers. A review of the many postings shows quite clearly that (a) nobody insulted anthropologists or sociologists, (b) nobody called people in these fields liars, and (c) nobody tried to intimidate Rohny. Our discussion focused on the relative value of qualitative research and the many inferential errors that stem from poorly designed studies.
I'm confident that almost all so-called quantitative researchers recognize the value of qualitative research as a means of generating inductive inferences about real-world phenomena. There are many examples of sound research of this type. What is at issue is the ability of qualitative research to provide credible evidence about cause-effect relations. Clearly, it is not a sound strategy for providing such evidence.
Eugene F. Stone-Romero
----- Original Message -----
From: rohny saylors
To: RMNET
Sent: Saturday, September 15, 2012 9:56 AM
Subject: Re: Causality in qualitative research
Hi
As a Ph.D. student starting the second year in my program, I am concerned that such eminent scholars are saying that, in trying to express perspectives from cultural anthropology and sociology, I'm in the camp of religious sophists, data fudgers, and outright liars.
Nothing has addressed the counter-factual that I presented: no one is appealing to God to tell you to accept the concept of terse narratives; inductive iteration between idea formation and updated belief doesn't change the final observation that Disney uses the unwarranted auspice of wholesomeness to take from the American people their former right to the public domain; and while I don't think Dr. Boje was faking his data in either case, I think that the obviousness of these findings in retrospect, and the immense utilization of the underlying process theory derived from them, supports strong belief in said findings, no matter their origins.
I honestly don't understand how religious sophistry, data fudging, or lying about findings has anything to do with showing two clear cases of qualitative findings of process-causality that defy lab experimentation. However, I must admit I find the rhetorical move of insulting those in the fields of cultural anthropology and sociology as liars quite intimidating: particularly since not only are the three of you eminent in your own right, but you must also represent a large cohort of folks within the academy that I hope to converse with.
-Rohny
On Sat, Sep 15, 2012 at 12:34 AM, John Antonakis <> wrote:
Hear, hear.
As the eminent Harvard statistician, Frederick Mosteller said, "While it is easy to lie with statistics, it is even easier to lie without them."
What Tom and Gene say about "making it up" when using a qualitative protocol is what my co-authors and I were saying, in a more polite way, in our causal claims paper, where we said: "and we will not get into another limitation of observer, particularly participant–observer, confirmation bias [Nickerson, 1998]."
Of course, everyone can fudge anything; however, it is becoming easier to catch quantitative "fudgers". I am keenly following the work of Uri Simonsohn in this regard, see:
Dietz and I also used Monte Carlo simulations to show how the results of a study simply did not stack up (this paper did not report fraudulent data--just very opaque and weak/incorrect methods, which conveniently "excluded" important covariates that had been gathered; the authors also chopped up the data using an "extreme groups" analysis, which is known to engender specious findings; a small illustrative sketch of this follows the references below). Once correctly analyzed, we showed that the claims made were incorrect:
Antonakis, J., & Dietz, J. (2011). Looking for Validity or Testing It? The Perils of Stepwise Regression, Extreme-Scores Analysis, Heteroscedasticity, and Measurement Error. Personality and Individual Differences, 50(3), 409-415.
http://dx.doi.org/10.1016/j.paid.2010.09.014
Antonakis, J., & Dietz, J. (2011). More on testing for validity instead of looking for it. Personality and Individual Differences, 50(3), 418-421.
http://dx.doi.org/10.1016/j.paid.2010.10.008
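As a purely illustrative aside, here is a minimal Monte Carlo sketch of the extreme-groups problem. It is not the analysis from the papers above; it assumes a Python environment with numpy, and every parameter value in it is arbitrary. It simply shows that selecting only the tails of a predictor inflates the apparent correlation relative to the full sample.

import numpy as np

rng = np.random.default_rng(42)
n, true_r, n_sims = 300, 0.20, 2000
full_rs, extreme_rs = [], []

for _ in range(n_sims):
    x = rng.standard_normal(n)
    # y correlates with x at roughly the chosen true value (.20)
    y = true_r * x + np.sqrt(1 - true_r**2) * rng.standard_normal(n)
    full_rs.append(np.corrcoef(x, y)[0, 1])

    # "Extreme groups": keep only the bottom and top quartiles of x
    lo, hi = np.quantile(x, [0.25, 0.75])
    keep = (x <= lo) | (x >= hi)
    extreme_rs.append(np.corrcoef(x[keep], y[keep])[0, 1])

print(f"Mean r, full sample:    {np.mean(full_rs):.3f}")     # near .20
print(f"Mean r, extreme groups: {np.mean(extreme_rs):.3f}")  # noticeably larger, roughly .27

The inflation in the second number is a pure selection artifact: nothing about the underlying relation between x and y has changed.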
And how we started our paper says it all:
"Tavris and Aronson (2007, p. 108) noted that 'the scientific method consists of the use of procedures designed to show not that our predictions and hypotheses are right, but that they might be wrong'."
I think that the above two papers could make very interesting teaching material for methods classes. Interestingly, during the publication of our first paper, the authors withdrew their measure (we were not exactly told why, but we think it was because of our analyses). Those interested may want to look at the original paper, which on a prima facie reading seems to reach reasonable conclusions:
Warwick, J., Nettelbeck, T., & Ward, L. (2010). AEIM: A new measure and method of scoring abilities-based emotional intelligence. Personality and Individual Differences, 48(1), 66-71.
http://dx.doi.org/10.1016/j.paid.2009.08.018
This kind of citizen policing (and reviewer policing, too) is very difficult, and probably impossible, to do with purely qualitative papers.
Regards,
John.
Prof. John Antonakis
Faculty of Business and Economics
Department of Organizational Behavior
University of Lausanne
Internef #618
CH-1015 Lausanne-Dorigny
Switzerland
http://www.hec.unil.ch/people/jantonakis
Associate Editor
The Leadership Quarterly
On 15.09.2012 03:02, Becker, Thomas E wrote:
I have to agree with Gene. It makes me nervous when I hear someone talking about "telling a story" rather than reporting the results of their research. I always suspect that the story they're telling may be at least partially fiction - for example, perhaps they are omitting data that are inconsistent with their story, over-emphasizing data that are consistent with their story, re-ordering studies to make their story sound more logical or theory-driven than it was, or making up hypotheses post-hoc to make their story more coherent. Of course, this can happen with either qualitative or quantitative research. Either way, in my view, it is unethical and works against good science. However, it may increase your chances of publication.
Tom
Thomas E. Becker, Ph.D.
Chairperson and Professor
Dept of Business Administration
University of Delaware
Newark, DE 19716-2710
Phone: (302) 831-6822
From: Gene & Dianna [wolfcub1@satx.rr.com]
Sent: Friday, September 14, 2012 4:22 PM
To: RMNET
Subject: Re: Causality in qualitative research
Centuries ago, the Catholic church held that the earth was the center of the universe. It had to be so because the church said it was. This was their view of reality. However, it was far from valid.
In more recent times the so-called "creationists" have argued that God created the world. This is their own chimerical "view" of reality. Tenaciously holding onto their beliefs, the creationists have tried to stop the teaching of evolution in the public schools. Regrettably, they are still doing it in Texas. Notwithstanding their views, there is not one bit of credible evidence to support their "causal" (better yet, casual) beliefs.
Neither of the above-noted views provides a sound basis for informing "organizational, public, or social policy." Quite to the contrary, policy based on such views would be quite destructive and counterproductive.
In short, I'm not convinced that qualitative research provides a sound basis for adducing evidence about cause-effect relations. Anyone can "tell a story." However, the stories that are told by people need not have any connection with the reality underlying various processes. To argue that the stories are useful because they are "context-based" is little more than sophistry!
Best regards,
Gene
Eugene F. Stone-Romero
----- Original Message -----
From: rohny saylors
To: RMNET
Sent: Friday, September 14, 2012 3:00 PM
Subject: Re: Causality in qualitative research
"I welcome one or more examples of such a study."
Past constructions of social narratives are evoked using terse narratives "x y z, but you know the story ..." in order to cause the social meaning of the story to influence the existing context. This is one example of process causality that has been derived using an anthropological (and dare I say post-modern) lens.
Boje, D. M. (1991). The storytelling organization: A study of story performance in an office-supply firm. Administrative Science Quarterly, 106-126.
"That does not mean that causal attributions can be accurately made"
Yes, and it does not mean that causal attributions will be inaccurately made.
"Accurate causal attributions are not likely to be derived from these qualitative methods."
What you may be missing is that what each of the individuals says was causal is what was causal, as it relates to the dependent variable of "how they see things," which is directly related to the function of "resolving the issue." In their world, by their logic (and the logic of those they influence), this is causality. Getting at what "really" happened (from a God's-eye view) may even be irrelevant, because not much of what "really" happened (from a God's-eye view, except the co-social construction of retrospective multiple realities) much influences the situation. (Not to say that understanding how and why things diverge from a God's-eye view is useless from one point of view; process is simply about a different set of questions.) The solution, of course, is to get everyone to see what happened from other parties' perspectives; instead of finding "the truth" that a social psychologist is looking for, we create an opportunity for greater understanding.
"Different realities are NOT equally valid."
No, but no 'reality' is ever purely valid, and very few are without any validity. If you assume there is a singular reality that is valid, then you are in violation of others' assumptions that say there is not; both parties hold these contrary assumptions for good, logical, consistently applied reasons. (If you hold, uncritically, to either side, it is easy to dismiss entire worlds of empirical knowledge because they disagree with your closely held assumptions.) Not all opinions are equal; but how those opinions differentially influence the realities that have been socially constructed (and which people must navigate) is a question entirely different from the question of "what is the underlying reality here?" Indeed, even if there were some fact underlying it all, the truth of it is likely poorly correlated with the realities of the world in which people interact (though why and how they diverge is an interesting question and a potential playground for multiple-method, multiple-paradigm research).
"socially constructed self-esteem"
Our ideas regarding what bio-physical reactions mean are socially constructed: The problem is that the attempt at changing social-construction was done poorly because it failed to recognize the historic social-system in which this construction was located. This is not a disproof of the social-construction theory, but it is a counter-factual which may update prior probability and require that we add a more historical-materialist perspective to our understanding of the social construction of reality (or in some other way address assumptions in the theory).
"These qualitatively based opinions do not provide a basis on which to make organizational, public, or socia'i policy." Of course not, if we assume that a social psychological model must be used as said basis. If we do not assume that, but rather allow for anthropological and sociological assumptions as said basis instead, then they do indeed provide a basis. Here is an example of providing a basis on which to make organizational, public or social policy: but which is unlikely relevant to lab studies.
Social privileging of the elite narrative of Disney has caused the marginalized voices and excluded stories of the darker side of Disney to be ignored, thus creating a halo effect around a company that essentially brought mass production to the cartoon industry. This halo effect around Mickey Mouse is why we do not see major works going into the public domain X number of years after the IP creator's death: as we approach the new X number of years after Walt's death, Congress extends the time so that our beloved Mickey Mouse is never "lost" to the public domain. That Disney disenfranchised and created a newspeak, Orwellian environment around its turning the trade skill of the cartoon artist into a production facility, where what are essentially factory workers fill in the space between key frames, has caused this mega-corporation to have the social cloak of "wholesomeness" that allows it to influence our politics and take from the people part of our freedom of the public domain.
Abstract:
Disney is the cause of our loss of a social freedom; it works by spinning a narrative of "wholesomeness" that covers up other readings of its history that might keep it from being as influential.
This is a finding of causality from a single case with major influence; you cannot do a quasi-experimental design on it, you cannot get the actors into a lab, and you may be able to add some level of legitimacy to the findings with quantitative data. However, since we do know the causality without it, said numbers would be a rhetorical whitewash to gain the respect of people who come at the world from the perceptions of positivism, but not actually illuminating regarding the causality here.
Boje, D. M. (1995). Stories of the storytelling organization: A postmodern analysis of Disney as "Tamara-Land". Academy of Management Journal, 997-1035.
I accept that this may not be either useful or interesting to people from a social-psych paradigm, but it is valuable as a basis on which to make organizational, public, or social policy.
-Rohny
" If it is indeed possible to make causal inferences on the basis of qualitative research, a qualitative researcher ought to be able to provide us with an example (real or contrived) of a qualitative study that provides a basis for making causal claims. I welcome one or more examples of such a study. "
On Fri, Sep 14, 2012 at 6:32 AM, Gene & Dianna <wolfcub1@satx.rr.com> wrote:
If it is indeed possible to make causal inferences on the basis of qualitative research, a qualitative researcher ought to be able to provide us with an example (real or contrived) of a qualitative study that provides a basis for making causal claims. I welcome one or more examples of such a study.
Best regards,
Gene
Eugene F. Stone-Romero
----- Original Message -----
From: rohny saylors
To: RMNET
Sent: Thursday, September 13, 2012 4:33 PM
Subject: Re: Causality in qualitative research
"however, qualitative research cannot inform us in a precise manner, and within some probabilistic framework, about the causes of effects."
"Simply because the previous statement is true does not mean that one can necessarily test causal claims using observational qualitative data."
"at the end of the day we can make no causal claims from purely qualitative observations"
I agree with you entirely, given the assumptions you are making about what allows us, in the first place, to make causal claims. However, when answering the 'how' question of sociological/anthropological process theory instead of the 'why' question of psychological theory, making causal claims from purely qualitative observations is not only possible but is usually the right approach.
You have noted that just because there are limitations to the positivist-scientific method that does not make my argument right, and that is very true. What I am trying to convey is that there is a world of sociologists and anthropologists that create rigorous, consistent, useful knowledge in a way that violates the psychological world-view of what allows us to make causal claims and does allow for an understanding of causality in what you might consider a boundary condition for causal knowledge.
This is entirely different from trying to use the idea that "different methods serve for different tasks," because you are right! From a positivist world-view, qualitative research is the induction phase, which allows us to create theory that is then tested with hypotheses and validated with replication. From a more phenomenological point of view, induction serves an entirely different purpose, because it assumes the proper role of empirical study is a deep, thick description of experience.
"There must be very sound reasoning, logic, and assumptions behind espoused methods and practice"
With this fundamental principle, I think (almost) all of us agree. It is simply that other fields have very sound reasoning, logic, and assumptions that are not the same as the sound reasoning, logic, and assumptions that come out of social-psych.
-Rohny
On Thu, Sep 13, 2012 at 8:32 AM, John Antonakis <John.Antonakis@unil.ch> wrote:
Hi Mike:
Man, the peg is being banged hard!
:-)
Best,
J.
__________________________________________
Prof. John Antonakis
Faculty of Business and Economics
Department of Organizational Behavior
University of Lausanne
Internef #618
CH-1015 Lausanne-Dorigny
Switzerland
http://www.hec.unil.ch/people/jantonakis
Associate Editor
The Leadership Quarterly
__________________________________________
On 13.09.2012 02:38, Michael James Zyphur wrote:
> Hi John,
> Thanks for your email. I appreciate your position. It sounds like there are different ways of understanding causes, effects, and cause-and-effect. These different ways will be as useful as their fit to the task at hand, and taking a read of the daily news, there are many types of tasks to which we organizational scholars can be put to work.
>
> For example, I won't be caught any time soon bringing questionnaires into an industrial labor dispute to figure out who caused what and when and why. Collecting qualitative data to figure out how the involved parties are making meaning in their situation is what I'd do. Cause and effect in this case cannot be separated from the parties' interpretations of the situation, which is multiple and shifting. Indeed, to call them interpretations is, perhaps, not as useful as thinking about "the situation" and "the parties" as a single morphing entity that can't be pinned down very simply. Traditional "y = bx + e" logic goes out the window, as such formalisms often do when the rubber meets the road with complex social systems where "y" and "x" cannot be defined in one way and the notion of a single and stable causal system with some parameter "b" is not going to help.
>
> Alternatively, for making general inferences about social systems that appear relatively stable, and where system features conform pretty well to the notion of discrete variables, then I'm very much with you and on board with "y = bx + e", if such formalism is what's called for. For example, looking for trends in phenomena over time can be greatly facilitated by formal modeling with numbers, as can looking for trends in any massive dataset. Yet, these methods will gloss over the particular and really only work for things that can be fit into a "y = bx + e" logic and not lose what's interesting about them. Restricting the notion of cause and effect for such phenomena and for those people who work with such formal models will prop up one way of thinking about and applying cause-and-effect, but this seems hegemonic and, most worrying to me, possibly limiting for our field and our ability to be helpful to the world as organizational researchers.
>
> It is this very last point that sticks out most for me when it comes to the evaluation of method, validity, rigor, and cause-and-effect. What a glorious burden we have!
>
> Thanks for the discussion,
> mike
>
>
> ________________________________________
> From: John Antonakis [John.Antonakis@unil.ch]
> Sent: Thursday, September 13, 2012 9:15 AM
> To: RMNET
> Subject: Re: Causality in qualitative research
>
> Hi Michael:
>
> I am espousing diversity of methods; and, different methods serve for
> different tasks. Thus, the fact that individuals have different mindsets that
> may collide at times (guess where I got that from!) does not justify
> sticking square pegs in round holes.
>
> Sure, one has the right to say what one wants, make whatever claims one
> wants, and use whatever methods one wants, and in the process defend
> one's identity as a researcher; however, there must be very sound
> reasoning, logic, and assumptions behind espoused methods and practices.
> There are things that qualitative research can do that quantitative
> research cannot. Likewise, there are things that quantitative research
> can do (viz. examine causal claims) that qualitative research cannot. I
> am happy to see the methods co-exist and do what they were intended to do.
>
> Best,
> J.
>
> __________________________________________
>
> Prof. John Antonakis
> Faculty of Business and Economics
> Department of Organizational Behavior
> University of Lausanne
> Internef #618
> CH-1015 Lausanne-Dorigny
> Switzerland
> Tel ++41 (0)21 692-3438
> Fax ++41 (0)21 692-3305
> http://www.hec.unil.ch/people/jantonakis
>
> Associate Editor
> The Leadership Quarterly
> __________________________________________
>
> On 13.09.2012 00:37, Michael James Zyphur wrote:
>> Hi y'all,
>> It sounds like Mandy and Rohny are talking about very different things than John and Gene. The quantitative approach common in management research derives from the physical sciences, where discrete and definable variables with constant meanings that can be used to uncover causal effects are helpful. Such a way of thinking about the world evolved not for the study of social phenomena, but for the study of biological and other physical systems. As Mandy notes, unlike the objects for which such methods were designed, in social systems people generatively make meaning in context in a dynamic and interactive fashion, meaning that "variables" may not be very useful concepts for social inquiry and that there may not even be a single external physical context that could be usefully quantified because its meaning is subject to change. Quantifying discrete variables to try and capture complex social processes might ruin what's most interesting about them, so a qualitative approach to describing
>> relations among actors and environments may work well and using a causal language is appropriate when it fits with the guiding epistemology of those who are doing the study. Alternatively, for other kinds of systems and questions, the brand of causality and quantification John and Gene are espousing can work well and is surely appropriate on their epistemic terms to answer their questions. This approach relies on formal models that miss the richness of a complex world, but simplify in ways that can be very useful for prediction on the terms used to build and interpret the models.
>>
>> The solution here doesn't seem to be trying to get the other side to adopt just one logic so that we can all agree. This doesn't respect diversity and limits the ways we would be allowed to study social systems, which is to say we would be restricted in the way we're allowed to think about and experience our worlds. Instead, some stock taking of which epistemology and method is good for which kinds of research questions and research goals would probably be most helpful (including how our methods help determine how our social science interfaces with society generally, which has implications for our relevance as a field). This will require open exchange and dialogue. Along the way, we should probably watch out for adherence to a single epistemology in a kind of "one size fits all" fashion. If you've ever purchased clothes making such claims then you know that this can leave you with your pants around your ankles ;-)
>>
>> Cheers
>> mike
>>
>>
>>
>>
>> ________________________________________
>> From: John Antonakis [John.Antonakis@unil.ch]
>> Sent: Thursday, September 13, 2012 7:08 AM
>> To: RMNET
>> Subject: Re: Causality in qualitative research
>>
>> Dear Mandy:
>>
>> Complex entities/meanings or what have you (entities for short) can
>> naturally occur; complex entities can naturally co-occur too. Whether
>> one quantifies what one observes or whether one writes up narratives of
>> what one observes is not at issue here.
>>
>> If one observes and describes a co-occurrence between two entities,
>> can one really know whether entity A causes entity B? There could
>> be a confound, a complex entity C, lurking "out there", that can also be
>> narratively described, but unbeknownst to the observer it is C that causes
>> both A and B. If that were the case, A and B are both consequences of C
>> (and thus A does not cause B). A well-intentioned and keen observer may
>> be able to construct a very convincing narrative that A causes B.
>> However, this observer will never know if A truly causes B. Because
>> causality has to do with discovering causes (A) of effects (B), to know
>> that A causes B, one must either manipulate A or quantify what one
>> measures so as to rule out that the covariation between A and B is
>> caused by a third variable (C).
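>>
>> A minimal simulation sketch of this third-variable point (purely illustrative;
>> it assumes a Python environment with numpy, and the effect sizes are arbitrary):
>>
>> import numpy as np
>>
>> rng = np.random.default_rng(7)
>> n = 10_000
>>
>> C = rng.standard_normal(n)             # the unobserved confound
>> A = 0.8 * C + rng.standard_normal(n)   # A is caused by C, not by B
>> B = 0.8 * C + rng.standard_normal(n)   # B is caused by C, not by A
>>
>> # Naive view: A and B co-occur strongly, inviting the narrative "A causes B"
>> print(np.corrcoef(A, B)[0, 1])         # roughly .39
>>
>> # Conditioning on the confound: partial out C and correlate the residuals
>> resid_A = A - np.polyfit(C, A, 1)[0] * C
>> resid_B = B - np.polyfit(C, B, 1)[0] * C
>> print(np.corrcoef(resid_A, resid_B)[0, 1])   # roughly 0; the "effect" vanishes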
>>
>> Of course, qualitative research can help us to understand a phenomenon
>> by generating ideas about what variables we should be studying and how
>> they unfold, or possibly how they may be related; however, qualitative
>> research cannot inform us in a precise manner, and within some
>> probabilistic framework, about the causes of effects. And, yes, I think
>> that qualitative research can help shed light on many things.
>>
>> By the way, I read the below paper in the Lancet and it has to do with the
>> problem of "measuring" tacit knowledge; I also read another paper in the
>> Lancet by Malterud (Malterud, K. (2001). Qualitative research:
>> standards, challenges, and guidelines. The Lancet, 358(9280), 483-488.
>> doi: 10.1016/s0140-6736(01)05627-6); I did not find any discussion about
>> "qualitative" causality in there either.
>>
>> Regards,
>> John.
>>
>> __________________________________________
>>
>> Prof. John Antonakis
>> Faculty of Business and Economics
>> Department of Organizational Behavior
>> University of Lausanne
>> Internef #618
>> CH-1015 Lausanne-Dorigny
>> Switzerland
>> Tel ++41 (0)21 692-3438
>> Fax ++41 (0)21 692-3305
>> http://www.hec.unil.ch/people/jantonakis
>>
>> Associate Editor
>> The Leadership Quarterly
>> __________________________________________
>>
>> On 12.09.2012 20:29, Mandy Lee wrote:
>>> Hmmm. I am a little wary of the kind of generalised claims that say causal inferences are "virtually never justified" in qualitative research.
>>>
>>> Maybe in organisational psychology and other fields where causal explanations are taken to mean only demonstrable relationships between quantifiable variables, then this sort of judgement about qualitative research would be entirely valid.
>>>
>>> Yet in organisational sociology that places primary importance on social context and social meaning as drivers of organisational action (such as those who subscribe to symbolic interactionism and other interpretivist paradigms in organisation studies), causal explanations at the micro (interpersonal interactions) and meso (communities of practice) levels can be uncovered using qualitative research, and in fact can *only* be uncovered using qualitative research, because it is the only kind of methodology that takes into account the subjective meanings that actors hold. Experimental methods are of little use for uncovering causal relationships that are already naturally occurring; they also assume several things that may be challenging to hold without question in organisation studies: such as that organisational actors can be safely reduced from reflexive, autonomous agents to stimulus-response automatons for experimental design purposes, or that one can replicate social reality, an assumption that is highly questionable especially when it comes to studying rapidly changing organisational fields.
>>>
>>> I have not yet read the manuscripts in the links below and have no relation to any of the authors listed in the original message to which Gene was responding, but even in healthcare, which has for centuries been dominated by the biomedical scientific model, with randomised controlled trials long regarded as "the gold standard" in evidence-based medicine regarding the efficacy of clinical interventions, there has been a growing recognition over the past two decades of the importance of qualitative research in informing practice within the framework of evidence-based healthcare. Articles in the Lancet extolled the importance of qualitative research in informing clinical practice more than a decade ago; and this is not counting the many Education and Debate articles published in the British Medical Journal and other premier healthcare journals in recent decades that have recognised the contribution of qualitative research to improving our understanding of how healthcare works in practice. And what is a causal explanation if not a model of how something works? Qualitative research may be completely useless at generating evidence regarding "what works" at the macro, systemic level, especially for biological rather than interpretive systems, for which we still require, and indeed demand, experimental methods that can give us 'cast-iron' evidence; but for generating evidence regarding how something works at the micro level, especially for interpretive systems such as organisations, we still require field research and inductive, naturalistic inquiry, which is what qualitative research offers.
>>>
>>> Some pertinent references:
>>>
>>> Blumer, Herbert. 1969[1998]. Symbolic Interactionism: Perspectives and Method. Berkeley: University of California Press.
>>>
>>> Daft, Richard L. and Weick, Karl E. 1984. Toward a model of organizations as interpretation systems. Academy of Management Review. 9(2): 284-295.
>>>
>>> Grypdonck, Maria H. 2006. Qualitative health research in the era of evidence-based practice. Qualitative Health Research. 16: 1371-1385.
>>>
>>> Hatch, M.J. and Yanow, D. 2003. Organization theory as interpretive science. In Tsoukas, H. and Knudsen, C. (eds.) The Oxford Handbook of Organization Theory. Oxford: Oxford University Press.
>>>
>>> Hurwitz, Brian. 2000. Narrative and the practice of medicine. The Lancet. 356: 2086-2089.
>>>
>>> Lambert, Helen, Gordon, Elisa J.; and Bogdan-Lovis, Elizabeth A. 2006. “Gift horse or Trojan horse? Social science perspectives on evidence-based health care.” Social Science and Medicine. 62(11): 2613–2620.
>>>
>>> Malterud, Kirsti. 2001. The art and science of clinical knowledge: evidence beyond measures and numbers. The Lancet. 358: 397-400.
>>>
>>> Petticrew, M. and Roberts, H. 2003. ‘Evidence, hierarchies, and typologies: horses for courses.’ Journal of Epidemiology and Community Health. 57: 527-529.
>>>
>>>
>>> Kind regards,
>>>
>>> Mandy
>>>
>>> ----------------------------------------
>>> Mandy S. Lee
>>> Health Policy and Management
>>> School of Medicine
>>> Trinity College Dublin
>>>
>>> ________________________________________
>>> From: Gene & Dianna [wolfcub1@satx.rr.com]
>>> Sent: 12 September 2012 15:46
>>> To: RMNET
>>> Subject: Re: Causality in qualitative research
>>>
>>> Notwithstanding the views of the below-noted authors, causal inferences are virtually never justified in qualitative research. However, research of this type may serve as a basis for hypothesizing causal relations between variables. For more on this see:
>>>
>>>
>>> Shadish, W. R., Cook, T. D., & Campbell, D. T. (2001). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton-Mifflin.
>>>
>>>
>>>
>>> Stone-Romero, E. F. (2010). Research strategies in industrial and organizational psychology: Nonexperimental, quasi-experimental, and randomized experimental research in special purpose and nonspecial purpose settings. In S. Zedeck (Ed.), Handbook of industrial and organizational psychology (pp. 35-70). Washington, DC: American Psychological Association Press.
>>>
>>>
>>> Best regards,
>>>
>>> Gene
>>>
>>> Eugene F. Stone-Romero
>>> wolfcub1@satx.rr.com
>>> ----- Original Message -----
>>> From: Michael James Zyphur [mzyphur@unimelb.edu.au]
>>> To: RMNET
>>> Sent: Tuesday, September 11, 2012 1:41 AM
>>> Subject: Causality in qualitative research
>>>
>>> Hi Netters,
>>> In case it is of interest, there is a recent special issue on causality in qualitative studies that might be of interest to those pursuing methods of causal inference:
>>>
>>>
>>> Special Section on Causality
>>>
>>> * Robert Donmoyer. Attributing Causality in Qualitative Research: Viable Option or Inappropriate Aspiration? An Introduction to a Collection of Papers. Qualitative Inquiry, October 2012, 18: 651-654. doi:10.1177/1077800412455012
>>> * Joseph A. Maxwell. The Importance of Qualitative Research for Causal Explanation in Education. Qualitative Inquiry, October 2012, 18: 655-661. doi:10.1177/1077800412452856
>>> * Robert Donmoyer. Can Qualitative Researchers Answer Policymakers' What-Works Question? Qualitative Inquiry, October 2012, 18: 662-673. doi:10.1177/1077800412454531
>>> * Gary L. Anderson and Janelle Scott. Toward an Intersectional Understanding of Process Causality and Social Context. Qualitative Inquiry, October 2012, 18: 674-685. doi:10.1177/1077800412452857
>>> * Frederick Erickson. Comments on Causality in Qualitative Inquiry. Qualitative Inquiry, October 2012, 18: 686-688. doi:10.1177/1077800412454834
>>>
>>>
>>>
>>> --
>>> When you come to a fork in the road, take it. --- Yogi Berra
>>>