The scalable, sustainable solution for global OA is for each author's own mandated institutional repository to be the designated locus of deposit for all published articles. These can of course also be exported to any other locus desired (actually only the link need be exported, once metadata interoperability is ensured).
Arxiv depositors will of course be able to keep on depositing directly in Arxiv as long as they wish. Why not? They were, after all, among the first wave of OA providers, and have been faithfully doing it for decades, unmandated. Their Arxiv deposits can simply be harvested back to their institutions, rather than trying to make these heroic depositors change their long-standing and progressive habits just because other disciplines didn't have the sense to deposit unmandated.
But it remains true that today most papers (across all disciplines) are not being deposited in Arxiv, nor in institutional repositories, nor anywhere else, within the first year of publication. Mandates from institutions and funders will remedy that.
But for mandates to be effective, they must demand minimal effort from authors and institutions, and it must be possible to monitor and ensure compliance.
The simplest and surest way to monitor and ensure compliance is for both institutions and funders to require convergent deposit in the author's institutional repository. That covers all papers, funded and unfunded (except the tiny minority by institutionally unaffiliated authors, who can deposit directly in institution-external repositories).
On the web, a distributed locus of deposit does not "fragment the literature." No one deposits directly in Google; Google harvests. Google currently indexes (inverts) all that distributed data and still has the best search functionality.
Once enough of the OA corpus is deposited in institutional repositories (IRs) to make it worthwhile bothering, it will be a piece of cake for an enterprising grad student to write the harvest and search code across the global network of OA IRs, and generations of grad students will continue optimizing these tools beyond even the imagination of today's sluggish, non-depositing scholarly and scientific researcher community...
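To make the harvesting point concrete: institutional repositories already expose their metadata via the OAI-PMH interoperability protocol, so a global harvester only needs to walk each repository's OAI endpoint and index what it finds. A minimal illustrative sketch in Python follows (the repository URL is a placeholder; a real harvester would add error handling, deduplication and full-text indexing):

    import requests
    import xml.etree.ElementTree as ET

    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    def harvest(base_url):
        """Yield (title, identifier) pairs for every Dublin Core record exposed by one repository."""
        params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
        while True:
            root = ET.fromstring(requests.get(base_url, params=params).content)
            for record in root.iter(OAI + "record"):
                title = record.findtext(".//" + DC + "title", default="")
                link = record.findtext(".//" + DC + "identifier", default="")
                yield title, link
            # Keep following the resumption token until the repository reports no more records.
            token = root.findtext(".//" + OAI + "resumptionToken")
            if not token:
                break
            params = {"verb": "ListRecords", "resumptionToken": token}

    # Hypothetical endpoint; substitute any institutional repository's OAI-PMH base URL.
    for title, link in harvest("https://repository.example.edu/oai"):
        print(title, "->", link)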
CHALLENGES TO THE RULE OF LAW IN THE EUROPEAN UNION: THE CASE OF HUNGARY
(March 3rd, 2015, Concordia University, Montreal, Canada)
Professor Kim Lane Scheppele, Princeton University, USA
Professor András Bozóki, Central European University, Hungary
Professor András Göllner, Concordia University, Canada
+ Hungarian Government Response (Lajos Olah, Deputy Head of Mission, Hungarian Embassy, Ottawa)
How Hungary's Orban Regime departed from democracy in the EU
(Prof A Bozoki, Central European University) (1 of 4 videos)
How the EU can help restore Democracy in Hungary
(Prof KL Scheppele, Princeton University) (2 of 4 videos)
Democracy 101
(Prof. A Gollner, Concordia University) (3 of 4 videos)
Many physicists say — and some may even believe — that peer review does not add much to their work, that they would do fine with just unrefereed preprints, and that they only continue to submit to peer-reviewed journals because they need to satisfy their promotion/evaluation committees.
And some of them may even be right. Certainly the giants in the field don’t benefit from peer review. They have no peers, and for them peer review just leads to regression toward the mean.
But that criterion does not scale to the whole field, nor to other fields, and peer review continues to be needed to maintain quality standards. That’s just the nature of human endeavor.
And the quality vetting and tagging are needed before you risk investing the time in reading, using and trying to build on work -- not after. (That's why it's getting so hard to find referees, why they're taking so long, and why they're often not doing a conscientious enough job, especially for journals whose quality standards are at or below the mean.)
Open Access means freeing peer-reviewed research from access tolls, not freeing it from peer review...
Harnad, S. (1998/2000/2004) The invisible hand of peer review. Nature [online] (5 Nov. 1998); Exploit Interactive 5 (2000); and in Shatz, B. (2004) (ed.) Peer Review: A Critical Inquiry. Rowman & Littlefield. Pp. 235-242. http://cogprints.org/1646/
Harnad, S. (2009) The PostGutenberg Open Access Journal. In: Cope, B. & Phillips, A. (Eds.) The Future of the Academic Journal. Chandos. http://eprints.soton.ac.uk/265617/
All the author opinions cited by U. Utah librarian Rick Anderson in his recent UKSG squib are familiar ones, based largely on author ignorance. Their rebuttals have been known for years (e.g., the self-archiving FAQ since 2001 and even earlier in the AmSci OA Forum). Most are covered in this:
The very same prima facie author objections would no doubt have been voiced if authors had been polled in advance on the (universal) mandate to publish or perish.
Although it’s unclear what his underlying motivation is, Utah librarian Rick Anderson has consistently sounded like a publisher’s advocate (or subscription agent!) for years and years now, and in his UKSG squib he is simply citing the persistence of author ignorance and the status quo as evidence and justification for the persistence of author ignorance and the status quo!
The remedy, of course, is effective global Green OA mandates.
Green OA and Green OA mandates grow anarchically, article by article and institution/funder by institution/funder, rather than journal by journal. So journals can only be cancelled once all or almost all of their contents are accessible via Green OA — and that day arrives only when Green OA and effective Green OA mandates have become global and are generating full or almost full compliance.
1. The US government denies entry to high-ranking Hungarian officials, including the head of the Hungarian IRS (a personal friend of the prime minister, Viktor Orban), for corruption (e.g., what amounts to demanding bribes from US companies for doing business in Hungary).
2. Orban (who calls all the shots in what he calls his “illiberal state”), instead of honestly and transparently investigating the corruption charges, demands that the US government do the investigation and provide the evidence, accuses the US of trying to manipulate Hungary for US purposes, and publicly orders the head of the Hungarian IRS to either sue the American embassy chargé d’affaires (the US messenger) for defamation or be fired from her job.
3. And now Orban extends an “olive branch”: “Let’s let bygones be bygones. Forget these corruption charges. Back to business as usual.”
There is something profoundly rotten going on in Hungary these days. Media control and other shenanigans have so far prevented the electorate from smelling it, for two terms, but by now the stench is becoming overwhelming internationally, and it’s even beginning to get through to the noses of the Hungarian citizenry, who have been demonstrating nonviolently in growing numbers for Orban’s ouster.
Orban, with his US “olive branch” in one hand, has publicly floated threats to amend the laws of public assembly to put an end to this public unrest, as part of a “national defence plan” to protect Hungary from the foreign forces fomenting these expressions of dissatisfaction from his unruly citizenry for their sinister, anti-Hungarian purposes...
Today's transitional period for peer-reviewed journal publishing -- when both the price of subscribing to conventional journals and the price of publishing in open-access journals ("Gold OA") are grossly inflated by obsolete costs and services -- is hardly the time to inflate costs still further by paying peer reviewers.
Institutions and funders need to mandate the open-access self-archiving of all published articles first ("Green OA"). This will make subscriptions unsustainable, forcing journals to downsize to providing only peer review, leaving access-provision and archiving to the distributed global network of institutional repositories. The price per submitted paper of managing peer review -- since peers review, and have always reviewed, for free -- is low, fair, affordable and sustainable, on a no-fault basis (irrespective of whether the paper is accepted or rejected: accepted authors should not have to subsidize the cost of rejected papers).
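To illustrate with purely hypothetical figures: if managing peer review costs, say, $200 per submitted paper and a journal accepts one submission in four, then charging only accepted authors would mean $800 per accepted paper, with the accepted subsidizing the cost of the rejected; a no-fault fee keeps it at $200 per submission for everyone.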
Let's get there first, before contemplating whether we really want to raise that cost yet again, this time by paying peers.
It would be a lot better if the Netherlands adopted a policy requiring Dutch researchers to make their published research OA rather than fussing about publishing costs and the costs of OA publishing.
Let me add a suggestion, updated for REF2014, that I have made before (unheeded):
Scientometric predictors of research performance need to be validated by showing that they have a high correlation with the external criterion they are trying to predict. The UK Research Excellence Framework (REF) -- together with the growing movement toward making the full texts of research articles freely available on the web -- offers a unique opportunity to test and validate a wealth of old and new scientometric predictors, through multiple regression analysis: publications, journal impact factors, citations, co-citations, citation chronometrics (age, growth, latency to peak, decay rate), hub/authority scores, h-index, prior funding, student counts, co-authorship scores, endogamy/exogamy, textual proximity, downloads/co-downloads and their chronometrics, tweets, tags, etc., can all be tested and validated jointly, discipline by discipline, against their REF panel rankings in REF2014. The weights of each predictor can be calibrated to maximize the joint correlation with the rankings. Open Access Scientometrics will provide powerful new means of navigating, evaluating, predicting and analyzing the growing Open Access database, as well as powerful incentives for making it grow faster.
Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79 (1)
(Also in Proceedings of 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds. 2007)
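By way of illustration only, here is a minimal sketch of that validation step in Python: regress a battery of candidate metrics jointly against the REF panel rankings for one discipline and read off the fitted weights. The file name, column names and the particular metrics chosen are hypothetical placeholders, not a prescribed dataset.

    import numpy as np
    import pandas as pd

    # Hypothetical table: one row per submission in a single discipline.
    df = pd.read_csv("metrics_by_department.csv")
    predictors = ["citations", "downloads", "h_index", "co_citations", "tweets"]
    criterion = "ref_panel_rank"   # the external, face-valid criterion to predict

    X = df[predictors].to_numpy(dtype=float)
    y = df[criterion].to_numpy(dtype=float)

    # Standardize the predictors so the fitted weights are comparable across metrics.
    X = (X - X.mean(axis=0)) / X.std(axis=0)

    # Ordinary least squares: the weights that jointly maximize the fit to the rankings.
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

    # Multiple correlation between the weighted metric battery and the panel rankings.
    predicted = X1 @ beta
    r = np.corrcoef(predicted, y)[0, 1]
    print("Joint correlation R =", round(r, 3))
    print(dict(zip(["intercept"] + predictors, np.round(beta, 3).tolist())))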
REF2014 gives the 2014 institutional and departmental rankings based on the four outputs submitted per researcher.
That is then the criterion against which the many other metrics I list below can be jointly validated, through multiple regression, to initialize their weights for REF2020, as well as for other assessments. In fact, open access metrics can be — and will be — continuously assessed, as open access grows. And as the initialized weights of the metric equation (per discipline) are optimized for predictive power, the metric equation can replace the peer rankings (except for periodic cross-checks and updates) -- or at least supplement them.
Single metrics can be abused, but not only can abuses be named and shamed when detected; it also becomes harder to abuse metrics when they are part of a multiple, inter-correlated vector, with disciplinary profiles of their normal interactions: someone dispatching a robot to download his papers would quickly be caught out when the usual correlation between downloads and later citations fails to appear. Add more variables and it gets even harder.
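As a purely illustrative sketch of that kind of consistency check (the file name, column names and the three-standard-deviation threshold are my assumptions, not an established procedure): fit the discipline's normal downloads-versus-citations relationship, then flag the cases whose downloads far exceed what their later citations would predict.

    import numpy as np
    import pandas as pd

    # Hypothetical per-author data for one discipline.
    df = pd.read_csv("downloads_citations.csv")

    # Fit the discipline's usual downloads-vs-citations relationship (simple linear fit).
    slope, intercept = np.polyfit(df["citations"], df["downloads"], 1)
    expected = slope * df["citations"] + intercept
    residual = df["downloads"] - expected

    # Flag authors whose downloads exceed expectation by more than three standard deviations:
    # the robot-download pattern described above, with no matching citations later.
    suspect = df[residual > 3 * residual.std()]
    print(suspect[["author", "downloads", "citations"]])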
In a weighted vector of multiple metrics like the sample I had listed, it’s of no use to a researcher to be told in advance what the metric equation and its weights will be for their particular discipline in REF2020.
Comment on: Mryglod, Olesya, Ralph Kenna, Yurij Holovatch and Bertrand Berche (2014) Predicting the results of the REF using departmental h-index: A look at biology, chemistry, physics, and sociology. LSE Impact Blog 12(6)
The topic of using metrics for research performance assessment in the UK has a rather long history, beginning with the work of Charles Oppenheim.
The solution is neither to abjure metrics nor to pick and stick to one unvalidated metric, whether it’s the journal impact factor or the h-index.
The solution is to jointly test and validate, field by field, a battery of multiple, diverse metrics (citations, downloads, links, tweets, tags, endogamy/exogamy, hubs/authorities, latency/longevity, co-citations, co-authorships, etc.) against a face-valid criterion (such as peer rankings).
Oppenheim, C. (1996). Do citations count? Citation indexing and the Research Assessment Exercise (RAE). Serials: The Journal for the Serials Community, 9(2), 155-161.
Oppenheim, C. (1997). The correlation between citation counts and the 1992 research assessment exercise ratings for British research in genetics, anatomy and archaeology. Journal of Documentation, 53(5), 477-487.
Oppenheim, C. (1995). The correlation between citation counts and the 1992 Research Assessment Exercise Ratings for British library and information science university departments. Journal of Documentation, 51(1), 18-27.
Oppenheim, C. (2007). Using the h-index to rank influential British researchers in information science and librarianship. Journal of the American Society for Information Science and Technology, 58(2), 297-301.
Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79(1). Also in: Proceedings of 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds. (2007)
The American Scientist Open Access Forum has been chronicling and often directing the course of progress in providing Open Access to Universities' Peer-Reviewed Research Articles since its inception in the US in 1998 by the American Scientist, published by the Sigma Xi Society.
The Forum is largely for policy-makers at universities, research institutions and research funding agencies worldwide who are interested in institutional Open Access Provision policy. (It is not a general discussion group for serials, pricing or publishing issues: it is specifically focussed on institutional Open Access policy.)