Wednesday, March 17. 2010
In announcing Alma Swan's Review of Studies on the Open Access Impact Advantage, I suggested that the growing body of studies on the OA Impact Advantage was ripe for a meta-analysis. Here is an update:
David Wilson wrote:
"Interesting discussion. Phil Davis has a limited albeit common view of meta-analysis. Within medicine, meta-analysis is generally applied to a small set of highly homogeneous studies. As such, the focus is on the overall or pooled effect with only a secondary focus on variability in effects. Within the social sciences, there is a strong tradition of meta-analyzing fairly heterogeneous sets of studies. The focus is clearly not on the overall effect, which would be rather meaningless, but rather on the variability in effect and the study characteristics, both methodological and substantive, that explain that variability.
"I don't know enough about this area to ascertain the credibility of [Phil Davis's] criticism of the methodologies of the various studies involved. However, the one study that [Phil Davis] claims is methodologically superior in terms of internal validity (which it might be) is clearly deficient in statistical power. As such, it provides only a weak test. Recall that a statistically nonsignificant finding is a weak finding -- a failure to reject the null, not an acceptance of the null.
"Meta-analysis could be put to good use in this area. It won't resolve the issue of whether the studies that Davis thinks are flawed are in fact flawed. It could explore the consistency in effect across these studies and whether the effect varies by the method used. Both would add to the debate on this issue."

Lipsey, M.W. & Wilson, D.B. (2001) Practical Meta-Analysis. Sage.

David B. Wilson, Ph.D.
Associate Professor
Chair, Administration of Justice Department
George Mason University
10900 University Boulevard, MS 4F4
Manassas, VA 20110-2203
Added Mar 15 2010
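Wilson's distinction between the overall pooled effect and the variability around it can be made concrete with a short sketch. The effect sizes and variances below are purely illustrative (they are not drawn from the OA studies); the sketch computes an inverse-variance pooled estimate together with Cochran's Q and I², the usual summaries of between-study heterogeneity.

```python
import math

# Hypothetical (log) citation-ratio effect sizes and their variances from
# k = 5 studies. These numbers are invented for illustration only.
effects = [0.60, 0.05, 0.45, -0.10, 0.30]
variances = [0.01, 0.01, 0.02, 0.02, 0.01]

# Fixed-effect (inverse-variance) pooled estimate.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Cochran's Q and I^2 quantify the between-study variability that,
# as Wilson notes, is the real object of interest in heterogeneous
# literatures, rather than the pooled effect alone.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled effect = {pooled:.3f}")
print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.1f}%")
```

With these invented inputs, Q far exceeds its degrees of freedom and I² is large, which is exactly the situation in which a meta-analyst would model the effect as varying with study characteristics (e.g., the citation-counting method used) rather than report a single pooled number.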
See also (thanks to Peter Suber for spotting this study!):
Wagner, A. Ben (2010) Open Access Citation Advantage: An Annotated Bibliography. Issues in Science and Technology Librarianship 60, Winter 2010.
On Mar 12, 2010, Gene V Glass wrote the following:
"Far more issues about OA and meta-analysis have been raised in this thread than I am [able to] comment on. But having dedicated 35 years of my efforts to meta-analysis and 20 to OA, I can’t resist a couple of quick observations.
Holding up one set of methods (be they RCTs or whatever) as the gold standard is inconsistent with decades of empirical work in meta-analysis showing that “perfect studies” and “less than perfect studies” seldom show important differences in results. If the question at hand concerns an experimental intervention, then random assignment to groups may well be inferior, as an equating technique, even to an ex post facto matching of groups. Randomization is not the royal road to equivalence of groups; it’s the road to probability statements about differences.
Claims about the superiority of certain methods are empirical claims. They are not a priori dicta about what evidence can and can not be looked at."

Glass, G.V.; McGaw, B.; & Smith, M.L. (1981). Meta-analysis in Social Research. Beverly Hills, CA: SAGE.
Rudner, Lawrence, Gene V Glass, David L. Evartt, and Patrick J. Emery (2000). A user's guide to the meta-analysis of research studies. ERIC Clearinghouse on Assessment and Evaluation, University of Maryland, College Park.
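Glass's point that randomization licenses probability statements rather than guaranteeing equivalent groups can be illustrated with a small simulation on a hypothetical covariate (all numbers here are invented): any single random assignment can leave the groups noticeably unequal, while the distribution of group differences across many randomizations is centred on zero.

```python
import random
import statistics

random.seed(1)

def randomize_once(population, n_per_group):
    """Draw 2*n subjects and randomly split them into two groups."""
    sample = random.sample(population, 2 * n_per_group)
    return sample[:n_per_group], sample[n_per_group:]

# A hypothetical pretest covariate, mean 50, sd 10.
population = [random.gauss(50, 10) for _ in range(1000)]

gaps = []
for _ in range(2000):
    a, b = randomize_once(population, 20)
    gaps.append(statistics.mean(a) - statistics.mean(b))

# Individual randomizations can leave sizeable covariate gaps...
print(f"largest observed gap: {max(abs(g) for g in gaps):.2f}")
# ...but across many randomizations the gaps centre on zero, which is
# what p-values and confidence intervals rest on.
print(f"mean gap across randomizations: {statistics.mean(gaps):.3f}")
```

The largest single-draw gap is several covariate points, even though nothing went "wrong" with the randomization; this is the sense in which random assignment is, as Glass puts it, a road to probability statements about differences rather than to equivalence of groups on any given draw.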
GVG's Publications